117 results for CONTINUOUS CHARACTERS
Abstract:
Conditional Gaussian (CG) distributions allow both discrete and continuous variables to be included in a model, under the assumption that the continuous variables are normally distributed. However, CG distributions have proved unsuitable for survival data, which tends to be highly skewed; a method of analysis is therefore required that can accommodate continuous variables that are not normally distributed. The aim of this paper is to introduce the more appropriate conditional phase-type (C-Ph) distribution for representing a continuous non-normal variable while also incorporating causal information in the form of a Bayesian network.
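Since the argument turns on survival times being too skewed for a Gaussian model, a minimal sketch may help; it samples a Coxian distribution (the simple sequential subclass of phase-type distributions) and reports its skewness. The rates and exit probabilities below are illustrative assumptions, not values from the paper.

```python
# A minimal sketch (not the paper's implementation): sampling a Coxian
# phase-type distribution and checking how far it is from normality.
import numpy as np

rng = np.random.default_rng(0)

def sample_coxian(rates, exit_probs, size):
    """Absorption times of a sequential (Coxian) phase-type process.

    rates[i]      -- exponential rate of phase i
    exit_probs[i] -- probability of absorbing directly from phase i
                     (the last phase always absorbs)
    """
    samples = np.empty(size)
    for s in range(size):
        t, phase = 0.0, 0
        while True:
            t += rng.exponential(1.0 / rates[phase])
            last = phase == len(rates) - 1
            if last or rng.random() < exit_probs[phase]:
                break
            phase += 1
        samples[s] = t
    return samples

x = sample_coxian(rates=[2.0, 1.0, 0.5], exit_probs=[0.3, 0.3, 1.0], size=10_000)
print(f"mean = {x.mean():.2f}, sd = {x.std():.2f}")
# Sample skewness well above 0 signals the non-normality that makes
# CG (normal-based) models unsuitable for such data.
skew = ((x - x.mean()) ** 3).mean() / x.std() ** 3
print(f"sample skewness = {skew:.2f}")
```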
Abstract:
The ionic liquid tributylmethylammonium methylcarbonate has been employed as a catalytic base for the clean N-methylation of indole with dimethyl carbonate. The reaction conditions were optimised under microwave heating to give 100% conversion and 100% selectivity to N-methylindole, and subsequently transferred to a high-temperature/high-pressure (285 °C/150 bar) continuous flow process using a short (3 min) residence time and 2 mol% of the catalyst to efficiently methylate a variety of amines, phenols, thiophenols and carboxylic acid substrates. The extremely short residence times, versatility, and high selectivity have significant implications for the synthesis of a wide range of pharmaceutical intermediates, as high product throughputs can be obtained via this scalable continuous flow protocol. It has also been shown that the ionic liquid can be generated in situ from tributylamine, which has the net effect of transforming an ineffective stoichiometric base into a highly efficient catalyst for this broad class of reactions.
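As background on why a 3 min residence time at full conversion implies high throughput, the standard flow-chemistry relation (not stated in the abstract; V and Q denote reactor volume and volumetric flow rate) is:

```latex
% Mean residence time in a flow reactor:
\tau \;=\; \frac{V}{Q}
% At fixed reactor volume V, a short \tau forces a high flow rate Q = V/\tau,
% and at 100% conversion the product throughput scales directly with Q.
```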
Abstract:
The combination of milli-scale processing and microwave heating has been investigated for the Cu-catalyzed Ullmann etherification in fine-chemical synthesis, providing improved catalytic activity and selective catalyst heating. Wall-coated and fixed-bed milli-reactors were designed and applied in the Cu-catalyzed Ullmann-type C-O coupling of phenol and 4-chloropyridine. In a batch reactor the results show clearly increased yields for the microwave-heated process at low microwave powers, whereas high powers and catalyst loadings reduced the benefits of microwave heating. Slightly higher yields were found in the Cu/ZnO wall-coated reactor than in the Cu/TiO2 fixed-bed flow reactor. The benefit here is that the reaction occurs at the surface of the metal nanoparticles confined within a support film, making the nano-copper equally accessible. Catalyst deactivation was mainly caused by Cu oxidation and coke formation; at longer process times, however, leaching played a significant role. Catalyst activity could be partially recovered by removing deposited by-product by means of calcination. After 6 h on-stream the reactor productivities were 28.3 and 55.1 kg_prod/(m³_R·h) for the fresh Cu/ZnO wall-coated and Cu/TiO2 fixed-bed reactors, respectively. Comparison of single- and multimode microwaves showed a threefold yield increase for single-mode microwaves. Control of nanoparticle size and loading makes it possible to avoid high temperatures in a single-mode microwave field and provides a novel solution to a major problem in combining metal catalysis with microwave heating. Catalyst stability appeared to be even more important, providing a twofold yield increase for the CuZn/TiO2 catalyst compared with the Cu/TiO2 catalyst, owing to copper stabilized by preferential oxidation of the zinc. For this catalyst a threefold yield increase was observed in single-mode microwaves which, to the best of our knowledge, led to a previously unreported productivity of 172 kg_prod/(m³_R·h) for microwave-assisted flow Ullmann C-O coupling. © 2012 Elsevier B.V.
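For reference, the volumetric productivity figures above follow the standard definition (an assumption; the paper may normalize slightly differently): mass of product formed per reactor volume per unit time.

```latex
% Volumetric reactor productivity (kg of product per m^3 of reactor per hour):
P \;=\; \frac{\dot{m}_{\mathrm{prod}}}{V_R}
\qquad \left[\,\mathrm{kg}_{\mathrm{prod}}\;\mathrm{m}_R^{-3}\;\mathrm{h}^{-1}\right]
```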
Abstract:
Data obtained with any research tool must be reproducible, a property referred to as reliability. Three techniques are often used to evaluate the reliability of tools producing continuous data in aging research: intraclass correlation coefficients (ICC), Pearson correlations, and paired t tests. These are often construed as equivalent when applied to reliability. They are not, and treating them as such may lead researchers to select instruments based on statistics that do not reflect actual reliability. The purpose of this paper is to compare the reliability estimates produced by these three techniques and to determine the preferable technique. A hypothetical dataset was produced to evaluate the reliability estimates obtained with ICCs, Pearson correlations, and paired t tests in three situations. For each situation, two sets of 20 observations were created to simulate an intrarater or inter-rater paradigm, based on 20 participants with two observations per participant. The situations were designed to demonstrate good agreement, systematic bias, or substantial random measurement error. In the situation demonstrating good agreement, all three techniques supported the conclusion that the data were reliable. In the situation demonstrating systematic bias, the ICC and t test suggested the data were not reliable, whereas the Pearson correlation suggested high reliability despite the systematic discrepancy. In the situation representing substantial random measurement error, where low reliability was expected, the ICC and Pearson coefficient accurately reflected this, while the t test suggested the data were reliable. The ICC is the preferred technique for measuring reliability; although it has some limitations, they can be overcome.
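Since all three statistics are easy to compute, a short sketch of the systematic-bias scenario may be useful. The dataset and offset below are illustrative assumptions, not the authors' hypothetical data; icc_2_1 implements the standard two-way random-effects, absolute-agreement ICC(2,1) formula.

```python
# A minimal sketch of the "systematic bias" scenario: the second observation
# equals the first plus a constant offset. Pearson r stays near 1 while the
# ICC drops and the paired t test flags the bias.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
r1 = rng.normal(50, 10, 20)          # 20 participants, first observation
r2 = r1 + 8 + rng.normal(0, 1, 20)   # second observation: systematic +8 shift

def icc_2_1(x, y):
    """Two-way random-effects, absolute-agreement ICC(2,1) for two raters."""
    data = np.column_stack([x, y])
    n, k = data.shape
    grand = data.mean()
    msr = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    msc = n * ((data.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    sse = ((data - grand) ** 2).sum() - msr * (n - 1) - msc * (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

print(f"ICC(2,1)  = {icc_2_1(r1, r2):.3f}")            # penalizes the offset
print(f"Pearson r = {stats.pearsonr(r1, r2)[0]:.3f}")  # ~1: blind to the offset
t, p = stats.ttest_rel(r1, r2)
print(f"paired t  = {t:.2f}, p = {p:.4f}")             # significant: detects bias
```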
Abstract:
We introduce a family of Hamiltonian systems for measurement-based quantum computation with continuous variables. The Hamiltonians (i) are quadratic, and therefore two-body, (ii) are of short range, (iii) are frustration-free, and (iv) possess a constant energy gap proportional to the squared inverse of the squeezing. Their ground states are the celebrated Gaussian graph states, which are universal resources for quantum computation in the limit of infinite squeezing. These Hamiltonians constitute the basic ingredient for the adiabatic preparation of graph states and thus open new avenues for the physical realization of continuous-variable quantum computing beyond the standard optical approaches. We characterize the correlations in these systems at thermal equilibrium. In particular, we prove that the correlations across any multipartition are contained exactly in its boundary, automatically yielding a correlation area law. © 2011 American Physical Society.
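For orientation, parent Hamiltonians of this kind are usually built from the graph-state nullifiers. The form below is a common construction from the CV graph-state literature (an assumed normalization, not necessarily the authors' exact Hamiltonian), with A the adjacency matrix of the graph:

```latex
% Nullifiers of a CV graph state (annihilate the ideal, infinitely squeezed state):
\hat{\delta}_j \;=\; \hat{p}_j - \sum_{k} A_{jk}\,\hat{x}_k
% A quadratic, frustration-free parent Hamiltonian built from them:
H \;=\; \sum_j \hat{\delta}_j^{\,2}
  \;=\; \sum_j \Big(\hat{p}_j - \sum_{k} A_{jk}\,\hat{x}_k\Big)^{2}
% Each term couples a mode only to its graph neighbors, so for a short-range
% graph the Hamiltonian is two-body and of short range, as listed in (i)-(ii).
```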
Abstract:
Let T be a compact disjointness preserving linear operator from C0(X) into C0(Y), where X and Y are locally compact Hausdorff spaces. We show that T can be represented as a norm-convergent countable sum of disjoint rank-one operators. More precisely, T = ∑_n δ_{x_n} ⊗ h_n for a (possibly finite) sequence {x_n}_n of distinct points in X and a norm-null sequence {h_n}_n of mutually disjoint functions in C0(Y). Moreover, we develop a graph-theoretic method to describe the spectrum of such an operator.
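In display form, with δ_{x_n} denoting point evaluation at x_n (notation assumed from the rank-one description above), the representation reads:

```latex
T \;=\; \sum_{n} \delta_{x_n} \otimes h_n ,
\qquad\text{i.e.}\qquad
(Tf)(y) \;=\; \sum_{n} f(x_n)\,h_n(y), \quad f \in C_0(X),\ y \in Y.
% The series converges in norm: the h_n are mutually disjoint, so at each y
% at most one term is nonzero, and \|h_n\| \to 0.
```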
Abstract:
Emotion research has long been dominated by the “standard method” of displaying posed or acted static images of facial expressions of emotion. While this method has been useful, it cannot investigate the dynamic nature of emotion expression. Although continuous self-report traces have enabled the measurement of dynamic expressions of emotion, no consensus has been reached on the statistical techniques that permit valid inferences from such measures. We propose Generalized Additive Models (GAMs) and Generalized Additive Mixed Models (GAMMs) as techniques that can account for the dynamic nature of such continuous measures. These models allow us to hold constant the shared components of responses that are due to perceived emotion across time, while enabling inference about linear differences between groups. The mixed-model GAMM approach is preferred because it can account for autocorrelation in time-series data and allows emotion-decoding participants to be modelled as random effects. To increase confidence in linear differences, we assess methods that address interactions between categorical variables and dynamic changes over time. In addition, we comment on the use of GAMs to assess the effect size of shared perceived emotion and discuss sample sizes. Finally, we address additional uses: the inference of feature detection, continuous-variable interactions, and the measurement of ambiguity.
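As a rough illustration of the GAM component only (the column names and data are made up, and the full GAMM machinery with random effects and autocorrelation, typically fitted in R's mgcv, is beyond this sketch), a statsmodels example of a shared smooth time trend plus a linear group difference:

```python
# A minimal sketch (assumed names; not the authors' pipeline): fit a smooth
# shared time course to continuous emotion-rating traces while estimating a
# linear (parametric) difference between two groups.
import numpy as np
import pandas as pd
from statsmodels.gam.api import GLMGam, BSplines

rng = np.random.default_rng(1)
t = np.tile(np.linspace(0, 10, 200), 2)            # time points, two groups
group = np.repeat(["control", "treatment"], 200)
shared = np.sin(t)                                 # shared perceived-emotion trend
y = shared + (group == "treatment") * 0.5 + rng.normal(0, 0.2, 400)
data = pd.DataFrame({"rating": y, "time": t, "group": group})

# Smooth spline basis for the shared time course; the group term stays linear.
splines = BSplines(data[["time"]], df=[10], degree=[3])
gam = GLMGam.from_formula("rating ~ C(group)", data=data, smoother=splines)
res = gam.fit()
print(res.summary())  # the C(group) coefficient is the linear group difference
```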
Abstract:
OBJECTIVE - To evaluate an algorithm guiding responses of continuous subcutaneous insulin infusion (CSII)-treated type 1 diabetic patients using real-time continuous glucose monitoring (RT-CGM). RESEARCH DESIGN AND METHODS - Sixty CSII-treated type 1 diabetic participants (aged 13-70 years, including adult and adolescent subgroups, with A1C ≤9.5%) were randomized in age-, sex-, and A1C-matched pairs. Phase 1 was an open 16-week multicenter randomized controlled trial. Group A was treated with CSII/RT-CGM with the algorithm, and group B was treated with CSII/RT-CGM without the algorithm. The primary outcome was the difference in time in target (4-10 mmol/l) glucose range on 6-day masked CGM. Secondary outcomes were differences in A1C, low (≤3.9 mmol/l) glucose CGM time, and glycemic variability. Phase 2 was the week 16-32 follow-up. Group A was returned to usual care, and group B was provided with the algorithm. Glycemia parameters were as above. Comparisons were made between baseline, 16 weeks, and 32 weeks. RESULTS - In phase 1, after withdrawals, 29 of 30 subjects remained in group A and 28 of 30 in group B. The change in target glucose time did not differ between groups. A1C fell in group A (mean 7.9% [95% CI 7.7-8.2] to 7.6% [7.2-8.0]; P < 0.03) but not in group B (7.8% [7.5-8.1] to 7.7% [7.3-8.0]; NS), with no difference between groups. More subjects in group A achieved A1C ≤7% than in group B (2 of 29 to 14 of 29 vs. 4 of 28 to 7 of 28; P = 0.015). In phase 2, one participant was lost from each group. In group A, A1C returned to baseline with RT-CGM discontinuation but did not change in group B, who continued RT-CGM with addition of the algorithm. CONCLUSIONS - Early but not late algorithm provision to type 1 diabetic patients using CSII/RT-CGM did not increase the target glucose time but increased achievement of A1C ≤7%. Upon RT-CGM cessation, A1C returned to baseline. © 2010 by the American Diabetes Association.
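As a side note on the primary outcome, time in the 4-10 mmol/l target range is straightforward to compute from a masked CGM trace; a small sketch with simulated readings (illustrative only, thresholds taken from the abstract):

```python
# A minimal sketch: percent of CGM time in the 4-10 mmol/l target range and
# in the low (<= 3.9 mmol/l) range, over 6 days of 5-minute readings.
import numpy as np

rng = np.random.default_rng(7)
glucose = rng.normal(7.5, 2.5, 6 * 24 * 12)  # simulated 6-day CGM trace (mmol/l)

in_target = (glucose >= 4.0) & (glucose <= 10.0)
low = glucose <= 3.9
print(f"time in target (4-10 mmol/l): {100 * in_target.mean():.1f}%")
print(f"time low (<=3.9 mmol/l):      {100 * low.mean():.1f}%")
```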
Abstract:
Sponge classification has long been based mainly on morphocladistic analyses but is now being strongly challenged by more than 12 years of accumulated molecular data analyses. The current study used phylogenetic hypotheses based on sequence data from 18S rRNA, 28S rRNA, and the CO1 barcoding fragment, combined with morphology, to justify the resurrection of the order Axinellida Lévi, 1953. Axinellida occupies a key position in different morphologically derived topologies. The abandonment of Axinellida and the establishment of Halichondrida Vosmaer, 1887 sensu lato to contain Halichondriidae Gray, 1867, Axinellidae Carter, 1875, Bubaridae Topsent, 1894, Heteroxyidae Dendy, 1905, and a new family Dictyonellidae van Soest et al., 1990 was based on the conclusion that an axially condensed skeleton evolved independently in separate lineages, in preference to the less parsimonious assumption that asters (star-shaped spicules), acanthostyles (club-shaped spicules with spines), and sigmata (C-shaped spicules) each evolved more than once. Our new molecular trees are congruent with one another and contrast with the earlier, morphologically based trees. The results show that axially condensed skeletons, asters, acanthostyles, and sigmata are all homoplasious characters. The unrecognized homoplasious nature of these characters explains much of the incongruence between molecular-based and morphology-based phylogenies. We use the molecular trees presented here as a basis for re-interpreting the morphological characters within Heteroscleromorpha. The implications for the classification of Heteroscleromorpha are discussed, and a new order, Biemnida ord. nov., is erected.