123 results for Gaussian extended cubature formula
Abstract:
Hemopoietic progenitor cells express clustered homeobox (Hox) genes in a pattern characteristic of their lineage and stage of differentiation. In general, HOX expression tends to be higher in more primitive cells and lower in lineage-committed cells. These trends have led to the hypothesis that self-renewal of hemopoietic stem/progenitor cells is HOX-dependent and that dysregulated HOX expression underlies maintenance of the leukemia-initiating cell. Gene expression profiling studies support this hypothesis and specifically highlight the importance of the HOXA cluster in hemopoiesis and leukemogenesis. Within this cluster, HOXA6 and HOXA9 are highly expressed in patients with acute myeloid leukemia and form part of the "Hox code" identified in murine models of this disease. We have examined endogenous expression of Hoxa6 and Hoxa9 in purified primary progenitors as well as in four growth factor-dependent cell lines (FDCP-Mix, EML, 32Dcl3, and Ba/F3), representative of early multipotential and later committed precursor cells, respectively. Hoxa6 was consistently expressed at higher levels than Hoxa9, was preferentially expressed in primitive cells, and was regulated by both growth factor and cell cycle. Enforced overexpression of HOXA6 or HOXA9 in FDCP-Mix cells resulted in increased proliferation and colony formation but had a negligible effect on differentiation. In both FDCP-Mix and the more committed Ba/F3 precursor cells, overexpression of HOXA6 potentiated factor-independent proliferation. These findings demonstrate that Hoxa6 is directly involved in fundamental processes of hemopoietic progenitor cell development.
Abstract:
Use of the Dempster-Shafer (D-S) theory of evidence to deal with uncertainty in knowledge-based systems has been widely addressed, and several AI implementations based on the D-S theory or its extensions have been undertaken. However, the representation of uncertain relationships between evidence and hypothesis groups (heuristic knowledge) remains a major problem. This paper presents an approach to representing such knowledge, in which Yen's probabilistic multi-set mappings are extended to evidential mappings, and Shafer's partition technique is used to obtain the mass function in a complex evidence space. A new graphical method for describing such knowledge is then introduced, extending the graphical model of Lowrance et al. Finally, an extended framework for evidential reasoning systems is specified.
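As a concrete reference point for the D-S machinery this paper builds on, the following is a minimal Python sketch of Dempster's rule of combination for two mass functions over a common frame of discernment. The frame, the mass values, and the function name are illustrative assumptions; the paper's evidential mappings go beyond this basic rule.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    A mass function maps frozensets (subsets of the frame of
    discernment) to masses summing to 1."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources are incompatible")
    # Normalize by the non-conflicting mass
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Hypothetical frame {flu, cold} with two independent pieces of evidence
m1 = {frozenset({"flu"}): 0.6, frozenset({"flu", "cold"}): 0.4}
m2 = {frozenset({"cold"}): 0.5, frozenset({"flu", "cold"}): 0.5}
print(combine(m1, m2))  # masses: flu 3/7, cold 2/7, {flu, cold} 2/7
```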
Abstract:
This paper discusses the relations between extended incidence calculus and assumption-based truth maintenance systems (ATMSs). We first prove that managing labels for statements (nodes) in an ATMS is equivalent to producing incidence sets of these statements in extended incidence calculus. We then demonstrate that the justification set for a node is functionally equivalent to the implication relation set for the same node in extended incidence calculus. As a consequence, extended incidence calculus can provide justifications for an ATMS, because implication relation sets are discovered by the system automatically. We also show that extended incidence calculus provides a theoretical basis for constructing a probabilistic ATMS by associating proper probability distributions with assumptions. In this way, we can not only produce labels for all nodes in the system but also calculate the probability of any such node. The nogood environments can also be obtained automatically. Therefore, extended incidence calculus and the ATMS are equivalent in carrying out inferences at both the symbolic and the numerical level. This extends a result due to Laskey and Lehner.
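For readers unfamiliar with incidence calculus, this minimal Python sketch illustrates the core idea the equivalence results above rest on: formulas are assigned incidence sets of possible worlds, conjunction corresponds to intersection, and probabilities are sums of world weights. The tiny world model and all names are illustrative assumptions, not the paper's construction.

```python
# Prior weights over a hypothetical set of possible worlds
worlds = {"w1": 0.4, "w2": 0.3, "w3": 0.2, "w4": 0.1}

# Incidence sets: the worlds in which each proposition holds
incidence = {
    "p": frozenset({"w1", "w2"}),
    "q": frozenset({"w2", "w3"}),
}

def prob(inc):
    """Probability of a formula = total weight of its incidence set."""
    return sum(worlds[w] for w in inc)

# i(p AND q) = i(p) & i(q); i(p OR q) = i(p) | i(q)
p_and_q = incidence["p"] & incidence["q"]
print(prob(incidence["p"]))  # 0.7
print(prob(p_and_q))         # 0.3
```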
Abstract:
We study state engineering through bilinear interactions between two remote qubits and two-mode Gaussian light fields. The attainable two-qubit states span the entire physically allowed region in the entanglement-versus-global-purity plane. Two-mode Gaussian states with maximal entanglement at fixed global and marginal entropies produce maximally entangled two-qubit states in the corresponding entropic diagram. We show that a small set of parameters characterizing extremally entangled two-mode Gaussian states is sufficient to control the engineering of extremally entangled two-qubit states, which can be realized in realistic matter-light scenarios.
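The entanglement-versus-purity plane referred to above can be made concrete with a small numerical check. The Python sketch below computes the global purity and the negativity (a standard two-qubit entanglement measure, chosen here as an assumption; the paper may quantify entanglement differently) along a one-parameter Werner-state family.

```python
import numpy as np

def purity(rho):
    """Global purity Tr(rho^2) of a density matrix."""
    return np.real(np.trace(rho @ rho))

def negativity(rho):
    """Two-qubit negativity via the partial transpose on the
    second qubit (PPT criterion)."""
    r = rho.reshape(2, 2, 2, 2)                       # <ij|rho|kl>
    rho_pt = r.transpose(0, 3, 2, 1).reshape(4, 4)    # transpose qubit B
    eigs = np.linalg.eigvalsh((rho_pt + rho_pt.conj().T) / 2)
    return np.abs(eigs[eigs < 0]).sum()

# Werner states: rho = p |Phi+><Phi+| + (1-p) I/4, entangled for p > 1/3
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
proj = np.outer(phi_plus, phi_plus)
for p in (0.2, 0.5, 1.0):
    rho = p * proj + (1 - p) * np.eye(4) / 4
    print(f"p={p}: purity={purity(rho):.3f}, negativity={negativity(rho):.3f}")
```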
Abstract:
Radiocarbon dating has been used infrequently as a chronological tool for research in Anglo-Saxon archaeology. Primarily, this is because the uncertainty of calibrated dates provides little advantage over traditional archaeological dating in this period. Recent advances in Bayesian methodology in conjunction with high-precision 14C dating have, however, created the possibility of both testing and refining the established Anglo-Saxon chronologies based on artifact typology. The calibration process within such a confined age range, however, relies heavily on the structural accuracy of the calibration curve. We have previously reported decadal measurements on a section of the Irish oak chronology for the period AD 495–725 (McCormac et al. 2004). In this paper, we present decadal measurements for the periods AD 395–485 and AD 735–805, which extends the original calibration set.
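The calibration step described above admits a compact numerical illustration. This Python sketch computes a posterior over calendar ages for a single 14C determination under the standard Gaussian measurement model with a flat calendar-age prior; the calibration-curve segment is synthetic, not IntCal or the Irish oak data.

```python
import numpy as np

def calibrate(c14_age, c14_err, cal_years, curve_age, curve_err):
    """Posterior over calendar years for one radiocarbon determination,
    assuming Gaussian errors on both the measurement and the curve."""
    var = c14_err**2 + curve_err**2
    like = np.exp(-0.5 * (c14_age - curve_age)**2 / var) / np.sqrt(var)
    return like / np.trapz(like, cal_years)  # normalize over the grid

# Illustrative (synthetic) calibration-curve segment, NOT real data
cal_years = np.arange(1400, 1600)               # calendar years
curve_age = 1500 + 30 * np.sin(cal_years / 20)  # toy curve wiggles
curve_err = np.full(cal_years.shape, 15.0)

post = calibrate(1520, 25, cal_years, curve_age, curve_err)
print(cal_years[np.argmax(post)])  # modal calendar age
```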
Abstract:
The monitoring of multivariate systems that exhibit non-Gaussian behavior is addressed. Existing work advocates the use of independent component analysis (ICA) to extract the underlying non-Gaussian data structure. Since some of the source signals may be Gaussian, the use of principal component analysis (PCA) is proposed to capture both the Gaussian and non-Gaussian source signals. A subsequent application of ICA then allows the extraction of the non-Gaussian components from the retained principal components (PCs). A further contribution is the use of a support vector data description to determine a confidence limit for the non-Gaussian components. Finally, a statistical test is developed to determine how many non-Gaussian components are encapsulated within the retained PCs, and associated monitoring statistics are defined. The utility of the proposed scheme is demonstrated by a simulation example and by the analysis of recorded data from an industrial melter.
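A minimal sketch of the PCA-then-ICA extraction step on synthetic data is given below. The paper's support vector data description confidence limit is replaced here by a simple empirical quantile, and the data dimensions and component counts are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)

# Synthetic plant data: one non-Gaussian source, one Gaussian source, noise
n = 1000
s_ng = rng.laplace(size=n)   # non-Gaussian source
s_g = rng.normal(size=n)     # Gaussian source
X = np.column_stack([s_ng + 0.5 * s_g,
                     s_g,
                     0.3 * s_ng + rng.normal(size=n)])

# Step 1: PCA retains the dominant (Gaussian + non-Gaussian) structure
pca = PCA(n_components=2)
scores = pca.fit_transform(X)

# Step 2: ICA extracts the non-Gaussian component(s) from the PC scores
ica = FastICA(n_components=1, random_state=0)
s_hat = ica.fit_transform(scores)

# A simple I^2-style monitoring statistic on the non-Gaussian component,
# with an empirical control limit standing in for the paper's SVDD limit
I2 = (s_hat**2).sum(axis=1)
print("99% empirical control limit:", np.quantile(I2, 0.99))
```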
Abstract:
The stochastic nature of oil price fluctuations is investigated over a twelve-year period, drawing on data from an existing database (the USA Energy Information Administration database, available online). We evaluate the scaling exponents of the fluctuations using several statistical methods, namely rescaled range analysis (R/S), scaled windowed variance analysis (SWV), and the generalized Hurst exponent (GH) method. Relying on the scaling exponents obtained, we apply a rescaling procedure to investigate the complex characteristics of the probability density functions (PDFs) governing oil price fluctuations. We find that the PDFs exhibit scale invariance and in fact collapse onto a single curve when increments are measured over microscales (typically less than 30 days). The time evolution of the distributions is well fitted by a Lévy-type stable distribution. The relevance of a Lévy distribution is made plausible by a simple model of nonlinear transfer. Our results also exhibit a degree of multifractality, as the PDFs change and converge toward a Gaussian distribution at the macroscales.
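Of the three estimators named above, rescaled range analysis is the most self-contained, and a minimal Python sketch follows. The window sizes and the synthetic white-noise check (which should yield H near 0.5) are illustrative assumptions, not the paper's data or exact procedure.

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Estimate the Hurst exponent by rescaled range (R/S) analysis:
    the slope of log(R/S) against log(window size)."""
    rs = []
    for w in window_sizes:
        vals = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())  # cumulative deviations
            r = dev.max() - dev.min()          # range of deviations
            s = seg.std()                      # standard deviation
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return slope

# Sanity check on Gaussian white-noise increments: H should be near 0.5
x = np.random.default_rng(1).normal(size=4096)
print(hurst_rs(x, [16, 32, 64, 128, 256]))
```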