955 results for Extended random set


Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Previous mathematical models for hepatic and tissue one-carbon metabolism have been combined and extended to include a blood plasma compartment. We use this model to study how the concentrations of metabolites that can be measured in the plasma are related to their respective intracellular concentrations. METHODS: The model consists of a set of ordinary differential equations, one for each metabolite in each compartment, and kinetic equations for metabolism and for transport between compartments. The model was validated by comparison to a variety of experimental data such as the methionine load test and variation in folate intake. We further extended this model by introducing random and systematic variation in enzyme activity. OUTCOMES AND CONCLUSIONS: A database of 10,000 virtual individuals was generated, each with a quantitatively different one-carbon metabolism. Our population has distributions of folate and homocysteine in the plasma and tissues that are similar to those found in the NHANES data. The model reproduces many other sets of clinical data. We show that tissue and plasma folate are highly correlated, but liver and plasma folate much less so. Oxidative stress increases the plasma S-adenosylmethionine/S-adenosylhomocysteine (SAM/SAH) ratio. We show that many relationships among variables are nonlinear, and in many cases we provide explanations. Sampling of subpopulations produces dramatically different apparent associations among variables. The model can be used to simulate populations with polymorphisms in genes for folate metabolism and variations in dietary input.
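
The abstract gives no equations or code; the following is a minimal, hypothetical sketch of the general approach it describes: a compartmental ODE system with Michaelis-Menten-style kinetics whose enzyme activities and intakes are randomly perturbed to generate a population of virtual individuals. The metabolite, compartments, rate constants and parameter values below are all illustrative placeholders, not the published model.

```python
# Illustrative sketch only: a toy two-compartment (liver/plasma) model with one
# metabolite, Michaelis-Menten export, first-order clearance, and random
# variation in "enzyme activity" (Vmax) and intake across virtual individuals.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, vmax_export, km_export, k_clear, intake):
    liver, plasma = y
    export = vmax_export * liver / (km_export + liver)   # liver -> plasma flux
    dliver = intake - export
    dplasma = export - k_clear * plasma
    return [dliver, dplasma]

rng = np.random.default_rng(0)
steady_states = []
for _ in range(1000):                                    # 1000 virtual individuals
    vmax = 10.0 * rng.lognormal(mean=0.0, sigma=0.2)     # random enzyme activity
    intake = 4.0 * rng.lognormal(mean=0.0, sigma=0.1)    # random dietary input
    sol = solve_ivp(rhs, (0, 500), [5.0, 1.0],
                    args=(vmax, 2.0, 0.8, intake), rtol=1e-8)
    steady_states.append(sol.y[:, -1])                   # near-steady-state values

liver_ss, plasma_ss = np.array(steady_states).T
print("liver/plasma correlation:", np.corrcoef(liver_ss, plasma_ss)[0, 1])
```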

Relevance:

30.00%

Publisher:

Abstract:

This paper theoretically analyses the recently proposed "Extended Partial Least Squares" (EPLS) algorithm. After pointing out some conceptual deficiencies, a revised algorithm is introduced that covers the middle ground between Partial Least Squares and Principal Component Analysis. It maximises a covariance criterion between a cause and an effect variable set (partial least squares) and allows a complete reconstruction of the recorded data (principal component analysis). The new and conceptually simpler EPLS algorithm has been successfully applied in detecting and diagnosing various fault conditions, where the original EPLS algorithm offered only fault detection.
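
As a rough illustration of the two ingredients the abstract contrasts, and not of the revised EPLS algorithm itself, the sketch below extracts one covariance-maximising direction from the SVD of the cross-covariance between a cause block X and an effect block Y (the PLS side), then takes PCA components of the X residual so the recorded data can be fully reconstructed (the PCA side). Matrix names, dimensions and the deflation scheme are all assumptions made for the example.

```python
# Illustrative only: one covariance-maximising (PLS-like) component plus PCA of
# the residual, so the cause block X can be reconstructed exactly.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 6))                                   # "cause" block
Y = X[:, :2] @ rng.standard_normal((2, 3)) + 0.1 * rng.standard_normal((200, 3))

Xc, Yc = X - X.mean(0), Y - Y.mean(0)

# PLS-style direction: leading left singular vector of the cross-covariance Xc'Yc
U, s, Vt = np.linalg.svd(Xc.T @ Yc, full_matrices=False)
w = U[:, 0]                              # weight maximising cov(Xc w, Yc v)
t = Xc @ w                               # score vector
p = Xc.T @ t / (t @ t)                   # loading used for deflation
E = Xc - np.outer(t, p)                  # residual of X after the PLS component

# PCA of the residual supplies the remaining components for full reconstruction
Ue, se, Vte = np.linalg.svd(E, full_matrices=False)
X_rebuilt = np.outer(t, p) + Ue @ np.diag(se) @ Vte
print("reconstruction error:", np.linalg.norm(Xc - X_rebuilt))
```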

Relevance:

30.00%

Publisher:

Abstract:

Course Scheduling consists of assigning lecture events to a limited set of specific timeslots and rooms. The objective is to satisfy as many soft constraints as possible, while maintaining a feasible solution timetable. The most successful techniques to date require a compute-intensive examination of the solution neighbourhood to direct searches to an optimum solution. Although they may require fewer neighbourhood moves than more exhaustive techniques to gain comparable results, they can take considerably longer to achieve success. This paper introduces an extended version of the Great Deluge Algorithm for the Course Timetabling problem which, while avoiding the problem of getting trapped in local optima, uses simple neighbourhood search heuristics to obtain solutions in a relatively short amount of time. The paper presents results based on a standard set of benchmark datasets, beating over half of the currently published best results, in some cases by up to 60%.
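
For readers unfamiliar with the underlying acceptance rule, here is a minimal, generic sketch of Great Deluge acceptance with a linearly decaying boundary level applied to an abstract cost function. It is not the extended algorithm or the timetabling neighbourhood moves described in the paper; the cost function, neighbour move and parameters are placeholders.

```python
# Generic Great Deluge acceptance sketch (minimisation). A candidate is accepted
# if it improves the current solution or its cost is below the "water level",
# which is lowered linearly at every iteration.
import random

def great_deluge(initial, cost, neighbour, iterations=10000, decay=None):
    current, best = initial, initial
    level = cost(initial)                        # initial water level
    if decay is None:
        decay = cost(initial) / iterations       # linear decay per iteration
    for _ in range(iterations):
        candidate = neighbour(current)
        if cost(candidate) <= cost(current) or cost(candidate) <= level:
            current = candidate
            if cost(current) < cost(best):
                best = current
        level -= decay                           # lower the boundary
    return best

# Toy usage: minimise a 1-D cost with random-walk neighbour moves.
cost = lambda x: (x - 3.0) ** 2 + 1.0
neighbour = lambda x: x + random.uniform(-0.5, 0.5)
print(great_deluge(0.0, cost, neighbour))
```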

Relevance:

30.00%

Publisher:

Abstract:

A new calibration curve for the conversion of radiocarbon ages to calibrated (cal) ages has been constructed and internationally ratified to replace IntCal98, which extended from 0-24 cal kyr BP (Before Present, 0 cal BP = AD 1950). The new calibration data set for terrestrial samples extends from 0-26 cal kyr BP, but with much higher resolution beyond 11.4 cal kyr BP than IntCal98. Dendrochronologically-dated tree-ring samples cover the period from 0-12.4 cal kyr BP. Beyond the end of the tree rings, data from marine records (corals and foraminifera) are converted to the atmospheric equivalent with a site-specific marine reservoir correction to provide terrestrial calibration from 12.4-26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a coherent statistical approach based on a random walk model, which takes into account the uncertainty in both the calendar age and the 14C age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The tree-ring data sets, sources of uncertainty, and regional offsets are discussed here. The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed in brief, but details are presented in Hughen et al. (this issue a). We do not make a recommendation for calibration beyond 26 cal kyr BP at this time; however, potential calibration data sets are compared in another paper (van der Plicht et al., this issue).
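
The calibration step such a curve supports can be summarised, in simplified form, as combining the measurement uncertainty with the curve uncertainty at each calendar year. The sketch below is an illustrative, simplified calibration of a single 14C determination against a made-up curve; it is not the IntCal04 random walk construction, and the curve shape, uncertainties and measured age are invented.

```python
# Simplified single-sample calibration sketch against a fake calibration curve.
# p(theta | y) is proportional to a normal density of the measured age y about
# the curve mean mu(theta), with variance sigma_y^2 + sigma_curve(theta)^2.
import numpy as np

cal_bp = np.arange(0, 26000, 5)                        # calendar grid (cal BP)
mu = cal_bp * 0.95 + 50.0 * np.sin(cal_bp / 300.0)     # fake curve mean (14C BP)
sigma_curve = np.full_like(mu, 25.0)                   # fake curve 1-sigma

y, sigma_y = 11_000.0, 40.0                            # measured 14C age +/- error
var = sigma_y**2 + sigma_curve**2
density = np.exp(-0.5 * (y - mu) ** 2 / var) / np.sqrt(var)
density /= density.sum() * (cal_bp[1] - cal_bp[0])     # normalise on the grid

print("highest-density calendar age (cal BP):", cal_bp[np.argmax(density)])
```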

Relevance:

30.00%

Publisher:

Abstract:

A complex number lambda is called an extended eigenvalue of a bounded linear operator T on a Banach space B if there exists a non-zero bounded linear operator X acting on B such that XT = lambda TX. We show that there are compact quasinilpotent operators on a separable Hilbert space, for which the set of extended eigenvalues is the one-point set {1}.

Relevance:

30.00%

Publisher:

Abstract:

Use of the Dempster-Shafer (D-S) theory of evidence to deal with uncertainty in knowledge-based systems has been widely addressed. Several AI implementations have been undertaken based on the D-S theory of evidence or the extended theory. But the representation of uncertain relationships between evidence and hypothesis groups (heuristic knowledge) is still a major problem. This paper presents an approach to representing such knowledge, in which Yen’s probabilistic multi-set mappings have been extended to evidential mappings, and Shafer’s partition technique is used to get the mass function in a complex evidence space. Then, a new graphic method for describing the knowledge is introduced which is an extension of the graphic model by Lowrance et al. Finally, an extended framework for evidential reasoning systems is specified.
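
As background for the mass-function machinery the paper builds on, the sketch below implements plain Dempster's rule of combination over a small frame of discernment. It does not implement Yen's evidential mappings or Shafer's partition technique from the paper; the hypotheses and mass values are invented.

```python
# Plain Dempster's rule of combination over a small frame of discernment.
# Focal elements are frozensets; a mass function is a dict {focal_set: mass}.
from itertools import product

def combine(m1, m2):
    raw, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                      # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {s: v / (1.0 - conflict) for s, v in raw.items()}

theta = frozenset({"h1", "h2", "h3"})                # frame of discernment
m1 = {frozenset({"h1"}): 0.6, theta: 0.4}
m2 = {frozenset({"h1", "h2"}): 0.7, theta: 0.3}
print(combine(m1, m2))                               # combined mass function
```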

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses the relations between extended incidence calculus and assumption-based truth maintenance systems (ATMSs). We first prove that managing labels for statements (nodes) in an ATMS is equivalent to producing incidence sets of these statements in extended incidence calculus. We then demonstrate that the justification set for a node is functionally equivalent to the implication relation set for the same node in extended incidence calculus. As a consequence, extended incidence calculus can provide justifications for an ATMS, because implication relation sets are discovered by the system automatically. We also show that extended incidence calculus provides a theoretical basis for constructing a probabilistic ATMS by associating proper probability distributions on assumptions. In this way, we can not only produce labels for all nodes in the system, but also calculate the probability of any such node. The nogood environments can also be obtained automatically. Therefore, extended incidence calculus and the ATMS are equivalent in carrying out inferences at both the symbolic level and the numerical level. This extends a result due to Laskey and Lehner.
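
The probabilistic step described at the end can be illustrated, in simplified form, by computing the probability of a node from its label, i.e. a set of environments, each a conjunction of independent assumptions. This is a generic ATMS-style calculation, not the extended incidence calculus construction itself, and the assumption names and probabilities below are made up.

```python
# Simplified illustration: probability that at least one environment in a node's
# label holds, given independent assumption probabilities, via inclusion-exclusion.
from itertools import combinations

p_assume = {"A1": 0.9, "A2": 0.8, "A3": 0.5}          # P(assumption holds), invented
label = [frozenset({"A1", "A2"}), frozenset({"A3"})]  # environments supporting the node

def prob_env_union(envs, p):
    total = 0.0
    for k in range(1, len(envs) + 1):
        for subset in combinations(envs, k):
            union = frozenset().union(*subset)        # assumptions in the chosen envs
            term = 1.0
            for a in union:
                term *= p[a]
            total += (-1) ** (k + 1) * term           # inclusion-exclusion sign
    return total

print("P(node):", prob_env_union(label, p_assume))    # 0.72 + 0.5 - 0.36 = 0.86
```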

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present the application of Hidden Conditional Random Fields (HCRFs) to modelling speech for visual speech recognition. HCRFs may be easily adapted to model long-range dependencies across an observation sequence. As a result, visual word recognition performance can be improved, as the model is able to take a more contextual approach to generating state sequences. Results are presented from a speaker-dependent, isolated-digit visual speech recognition task, with comparisons against a baseline HMM system. We first illustrate that word recognition rates on clean video using HCRFs can be improved by increasing the number of past and future observations taken into account by each state. We then compare model performances using various levels of video compression on the test set. As far as we are aware, this is the first attempted use of HCRFs for visual speech recognition.
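
The first experiment varies how many past and future frames each state sees. The sketch below only shows the generic step of windowing an observation sequence into stacked feature vectors so a model has temporal context; it is not an HCRF implementation, and the window size and feature dimension are arbitrary.

```python
# Stack each frame with w past and w future frames (edges padded by repetition),
# the usual way of giving a sequence model access to temporal context.
import numpy as np

def window_observations(obs, w):
    """obs: (T, d) array of per-frame features -> (T, (2*w + 1) * d)."""
    T, d = obs.shape
    padded = np.vstack([np.repeat(obs[:1], w, axis=0),
                        obs,
                        np.repeat(obs[-1:], w, axis=0)])
    return np.hstack([padded[i:i + T] for i in range(2 * w + 1)])

frames = np.random.default_rng(2).standard_normal((100, 20))  # fake visual features
print(window_observations(frames, w=3).shape)                 # -> (100, 140)
```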

Relevance:

30.00%

Publisher:

Abstract:

Radiocarbon dating has been used infrequently as a chronological tool for research in Anglo-Saxon archaeology. Primarily, this is because the uncertainty of calibrated dates provides little advantage over traditional archaeological dating in this period. Recent advances in Bayesian methodology in conjunction with high-precision 14C dating have, however, created the possibility of both testing and refining the established Anglo-Saxon chronologies based on typology of artifacts. The calibration process within such a confined age range, however, relies heavily on the structural accuracy of the calibration curve. We have previously reported decadal measurements on a section of the Irish oak chronology for the period AD 495–725 (McCormac et al. 2004). In this paper, we present decadal measurements for the periods AD 395–485 and AD 735–805, which extend the original calibration set.

Relevance:

30.00%

Publisher:

Abstract:

Side-channel attacks (SCA) threaten electronic cryptographic devices and can be carried out by monitoring the physical characteristics of security circuits. Differential Power Analysis (DPA) is one of the most widely studied side-channel attacks. Numerous countermeasure techniques, such as Random Delay Insertion (RDI), have been proposed to reduce the risk of DPA attacks against cryptographic devices. The RDI technique was first proposed for microprocessors but was shown to be unsuccessful when implemented on smartcards, as it was vulnerable to a variant of the DPA attack known as the Sliding-Window DPA attack. Previous research by the authors investigated the use of the RDI countermeasure for Field Programmable Gate Array (FPGA) based cryptographic devices. A split-RDI technique was proposed to improve the security of the RDI countermeasure. A set of critical parameters was also proposed that could be utilized at the design stage to optimize a security algorithm design with RDI in terms of area, speed and power. The authors also showed that RDI is an efficient countermeasure technique on FPGA in comparison to other countermeasures. In this article, a new RDI logic design is proposed that can be used to cost-efficiently implement RDI on FPGA devices. Sliding-Window DPA and realignment attacks, which were shown to be effective against RDI implemented on smartcard devices, are performed on the improved RDI FPGA implementation. We demonstrate that these attacks are unsuccessful, and we also propose a realignment technique that can be used to demonstrate the weakness of RDI implementations.
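
As a generic illustration of why realignment matters against random-delay countermeasures, and not of the paper's attack or proposed technique, the sketch below realigns simulated power traces to a reference trace by maximising cross-correlation. The trace shapes, delays and noise level are all invented.

```python
# Generic trace-realignment sketch: shift each trace so its cross-correlation
# with a reference trace is maximised. Traces here are synthetic.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(500)
reference = np.exp(-((t - 250) ** 2) / 200.0)            # synthetic power "peak"

def realign(trace, ref):
    corr = np.correlate(trace - trace.mean(), ref - ref.mean(), mode="full")
    shift = corr.argmax() - (len(ref) - 1)                # lag of best match
    return np.roll(trace, -shift)

traces = [np.roll(reference, rng.integers(-40, 40)) + 0.05 * rng.standard_normal(500)
          for _ in range(10)]                             # randomly delayed traces
aligned = np.array([realign(tr, reference) for tr in traces])
print("peak-position spread before/after:",
      np.std([tr.argmax() for tr in traces]),
      np.std(aligned.argmax(axis=1)))
```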

Relevance:

30.00%

Publisher:

Abstract:

We propose an exchange rate model that is a hybrid of the conventional specification with monetary fundamentals and the Evans–Lyons microstructure approach. We estimate a model augmented with order flow variables, using a unique data set: almost 100 monthly observations on interdealer order flow on dollar/euro and dollar/yen. The augmented macroeconomic, or “hybrid,” model exhibits greater in-sample stability and out-of-sample forecasting improvement vis-à-vis the basic macroeconomic and random walk specifications.
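
In simplified form, a hybrid specification of this kind amounts to regressing the exchange rate change on a fundamentals-based term plus order flow. The sketch below fits such a regression on synthetic monthly data with plain OLS; every variable name, coefficient and sample value is invented rather than taken from the paper.

```python
# Simplified "hybrid" regression sketch on synthetic monthly data:
# d(log exchange rate) ~ change in monetary fundamentals + interdealer order flow.
import numpy as np

rng = np.random.default_rng(4)
n = 100
d_fundamentals = rng.standard_normal(n)          # e.g. relative money/output changes
order_flow = rng.standard_normal(n)              # net buyer-initiated volume
d_log_rate = 0.2 * d_fundamentals + 0.5 * order_flow + 0.3 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), d_fundamentals, order_flow])
beta, *_ = np.linalg.lstsq(X, d_log_rate, rcond=None)
residuals = d_log_rate - X @ beta
r2 = 1.0 - residuals.var() / d_log_rate.var()
print("estimated coefficients (const, fundamentals, order flow):", beta)
print("in-sample R^2:", round(r2, 3))
```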

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces the discrete choice model-paradigm of Random Regret Minimization (RRM) to the field of environmental and resource economics. The RRM-approach has been very recently developed in the context of travel demand modelling and presents a tractable, regret-based alternative to the dominant choice-modelling paradigm based on Random Utility Maximization (RUM) theory. We highlight how RRM-based models provide closed form, logit-type formulations for choice probabilities that allow for capturing semi-compensatory behaviour and choice set-composition effects while being as parsimonious as their utilitarian counterparts. Using data from a Stated Choice-experiment aimed at identifying valuations of characteristics of nature parks, we compare RRM-based models and RUM-based models in terms of parameter estimates, goodness of fit, elasticities and consequential policy implications.
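
The closed-form, logit-type structure mentioned above can be illustrated with the standard RRM regret function, in which the regret of an alternative sums ln(1 + exp(beta_m (x_jm - x_im))) terms over competing alternatives j and attributes m, and choice probabilities are a logit over negative regrets. The attribute values and taste parameters below are arbitrary, and this is a generic sketch rather than the estimated models from the paper.

```python
# Illustrative RRM choice probabilities for one choice set.
# regret_i = sum_{j != i} sum_m ln(1 + exp(beta_m * (x_jm - x_im)))
# P(i)     = exp(-regret_i) / sum_k exp(-regret_k)
import numpy as np

X = np.array([[3.0, 1.0],      # alternative A: (scenic quality, travel cost)
              [2.0, 0.5],      # alternative B
              [1.0, 0.2]])     # alternative C
beta = np.array([0.8, -1.5])   # arbitrary taste parameters

n_alt = X.shape[0]
regret = np.zeros(n_alt)
for i in range(n_alt):
    for j in range(n_alt):
        if j != i:
            regret[i] += np.sum(np.log1p(np.exp(beta * (X[j] - X[i]))))

prob = np.exp(-regret) / np.exp(-regret).sum()
print("choice probabilities:", prob.round(3))
```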

Relevance:

30.00%

Publisher:

Abstract:

A multivariate Fokker-Planck-type kinetic equation modeling a test particle weakly interacting with an electrostatic plasma, in the presence of a magnetic field B, is analytically solved in an Ornstein-Uhlenbeck-type approximation. A new set of analytic expressions is obtained for the variable moments and the particle density as a function of time. The process is diffusive.
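
For orientation, the sketch below simulates a one-dimensional Ornstein-Uhlenbeck process with the Euler-Maruyama scheme and checks that the ensemble variance relaxes toward its stationary value, the diffusive behaviour referred to. The friction and diffusion coefficients are arbitrary, and this is not the multivariate, magnetised problem solved analytically in the paper.

```python
# 1-D Ornstein-Uhlenbeck sketch: dv = -gamma * v * dt + sqrt(2 * D) * dW,
# integrated with Euler-Maruyama over an ensemble of test particles.
import numpy as np

gamma, D = 1.0, 0.5                   # arbitrary friction and diffusion coefficients
dt, steps, n = 2.5e-3, 2000, 5000     # time step, number of steps, ensemble size
rng = np.random.default_rng(5)

v = np.zeros(n)                       # all particles start at rest
for _ in range(steps):
    v += -gamma * v * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n)

print("ensemble variance:", v.var())          # ~ (D/gamma)*(1 - exp(-2*gamma*t))
print("stationary value D/gamma:", D / gamma)
```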

Relevance:

30.00%

Publisher:

Abstract:

Sparse representation based visual tracking approaches have attracted increasing interest in the community in recent years. The main idea is to linearly represent each target candidate using a set of target and trivial templates while imposing a sparsity constraint on the representation coefficients. After the coefficients are obtained using L1-norm minimization methods, the candidate with the lowest error, when reconstructed using only the target templates and the associated coefficients, is taken as the tracking result. In spite of the promising performance widely reported, it is unclear whether the performance of these trackers can be maximised. In addition, the computational complexity caused by the dimensionality of the feature space limits these algorithms in real-time applications. In this paper, we propose a real-time visual tracking method based on structurally random projection and weighted least squares techniques. In particular, to enhance the discriminative capability of the tracker, we introduce background templates to the linear representation framework. To handle appearance variations over time, we relax the sparsity constraint using a weighted least squares (WLS) method to obtain the representation coefficients. To further reduce the computational complexity, structurally random projection is used to reduce the dimensionality of the feature space while preserving the pairwise distances between the data points in the feature space. Experimental results show that the proposed approach outperforms several state-of-the-art tracking methods.
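
The two numerical ingredients named in the abstract can be illustrated generically: a random projection that reduces feature dimension while roughly preserving pairwise distances, and a (regularised) weighted least squares solution for the representation coefficients over a template dictionary. Everything below, including dimensions, weights and regularisation, is a made-up toy rather than the proposed tracker; a plain Gaussian projection stands in for the structurally random projection.

```python
# Toy sketch: (1) random projection of high-dimensional features, (2) weighted
# least squares coefficients for a candidate over a template dictionary:
#   c = argmin ||W^(1/2) (A c - y)||^2 + lambda * ||c||^2
import numpy as np

rng = np.random.default_rng(6)
d_high, d_low, n_templates = 1024, 64, 20

A_high = rng.standard_normal((d_high, n_templates))          # target + background templates
y_high = A_high[:, 0] + 0.1 * rng.standard_normal(d_high)    # a candidate patch

P = rng.standard_normal((d_low, d_high)) / np.sqrt(d_low)    # random projection matrix
A, y = P @ A_high, P @ y_high                                # projected dictionary/candidate

w = np.ones(d_low)                       # per-dimension weights (all equal in this toy)
lam = 1e-2
Aw = A * w[:, None]                      # W A
coef = np.linalg.solve(A.T @ Aw + lam * np.eye(n_templates), Aw.T @ y)

print("largest coefficient index:", int(np.argmax(np.abs(coef))),
      "reconstruction error:", np.linalg.norm(A @ coef - y))
```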

Relevance:

30.00%

Publisher:

Abstract:

Objectives: To develop and manufacture both immediate and sustained release vaginal tablets containing the anticancer drug disulfiram, which has the potential to be used as a non-invasive treatment for cervical cancer. 
Methods: Disulfiram-loaded vaginal tablets were manufactured at pilot scale using the direct compression method. These tablets were tested in accordance with the European Pharmacopeia guidelines for the testing of solid dosage forms. They were also tested using a biorelevant dissolution method as well as a dual-chambered release model designed to better mimic the dynamic nature of the vaginal vault.
Key findings: We have developed both immediate and sustained release vaginal tablets, which when manufactured at pilot scale are within the limits set by the European Pharmacopeia for the testing of solid dosage forms. Furthermore, these tablets are capable of releasing disulfiram in vitro using the dual-chambered release model at levels 25 000 times and 35 000 times greater than its IC50 concentration for the HeLa cervical cancer cell line. 
Conclusions: The successful pilot manufacture and testing of both the immediate and sustained release disulfiram-loaded vaginal tablets warrant further investigation, using an in-vivo model, to assess their potential for use as a non-invasive treatment option for cervical cancer.