921 results for Entropy of Tsallis
Abstract:
We report dramatic sensitivity enhancements in multidimensional MAS NMR spectra by the use of nonuniform sampling (NUS) and introduce maximum entropy interpolation (MINT) processing, which assures linearity between the time and frequency domains of the NUS-acquired data sets. A systematic analysis of sensitivity and resolution in 2D and 3D NUS spectra reveals that with NUS, at least 1.5- to 2-fold sensitivity enhancement can be attained in each indirect dimension without compromising spectral resolution. These enhancements are similar to or higher than those attained by the newest-generation commercial cryogenic probes. We explore the benefits of this NUS/MaxEnt approach in proteins and protein assemblies using 1-73-(U-C-13,N-15)/74-108-(U-N-15) Escherichia coli thioredoxin reassembly. We demonstrate that in thioredoxin reassembly, NUS permits acquisition of high-quality 3D-NCACX spectra, which are inaccessible with conventional sampling due to prohibitively long experiment times. Of critical importance, issues that hinder NUS-based SNR enhancement in 3D NMR of liquids are mitigated in the study of solid samples, in which theoretical enhancements on the order of 3- to 4-fold are accessible by compounding the NUS-based SNR enhancement of each indirect dimension. NUS/MINT is anticipated to be widely applicable and advantageous for multidimensional heteronuclear MAS NMR spectroscopy of proteins, protein assemblies, and other biological systems.
Abstract:
We have studied the structure and stability of (H3O+)(H2O)8 clusters using a combination of molecular dynamics sampling and high-level ab initio calculations. Twenty distinct oxygen frameworks are found within 2 kcal/mol of the electronic or standard Gibbs free energy minimum. The impact of quantum zero-point vibrational corrections on the relative stability of these isomers is quite significant. The box-like isomers are favored in terms of electronic energy, but with the inclusion of zero-point vibrational corrections and entropic effects, tree-like isomers are favored at higher temperatures. At temperatures from 0 to 298.15 K, the global minimum is predicted to be a tree-like structure with one dangling singly coordinated water molecule. Above 298.15 K, higher-entropy tree-like isomers with two or more singly coordinated water molecules are favored. These assignments are generally consistent with experimental IR spectra of (H3O+)(H2O)8 obtained at 150 K.
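The temperature-dependent competition between isomers described above follows from Boltzmann weighting of their relative Gibbs free energies. A minimal sketch; the energies and temperature below are hypothetical illustrations, not values from the study:

```python
import math

def boltzmann_populations(free_energies_kcal, temperature_k):
    """Relative isomer populations from Gibbs free energies (kcal/mol)."""
    R = 1.987204e-3  # gas constant in kcal/(mol*K)
    g_min = min(free_energies_kcal)
    weights = [math.exp(-(g - g_min) / (R * temperature_k))
               for g in free_energies_kcal]
    z = sum(weights)
    return [w / z for w in weights]

# Hypothetical relative free energies (kcal/mol) of three cluster isomers
pops = boltzmann_populations([0.0, 0.5, 1.2], temperature_k=298.15)
```

Raising the temperature flattens this distribution, which is why higher-entropy isomers overtake the electronic-energy minimum once entropic contributions are folded into the free energies.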
Abstract:
Misconceptions exist in all fields of learning and develop through a person’s preconception of how the world works. Students with misconceptions in chemical engineering are not capable of correctly transferring knowledge to a new situation and will likely arrive at an incorrect solution. The purpose of this thesis was to repair misconceptions in thermodynamics by using inquiry-based activities. Inquiry-based learning is a method of teaching that involves hands-on learning and self-discovery. Previous work has shown that inquiry-based methods result in better conceptual understanding by students relative to traditional lectures. The thermodynamics activities were designed to guide students toward the correct conceptual understanding by letting them observe a preconception fail to hold up in an experiment or simulation. The developed activities focus on the following topics in thermodynamics: “internal energy versus enthalpy”, “equilibrium versus steady state”, and “entropy”. For each topic, two activities were designed to clarify the concept and ensure it was properly grasped. Each activity was coupled with an instruction packet containing the experimental procedure as well as pre- and post-analysis questions, which were used to analyze the effect of the activities on the students’ responses. Concept inventories were used to monitor students’ conceptual understanding at the beginning and end of the semester. The results did not show a statistically significant increase in the overall concept inventory scores for students who performed the activities compared to traditional learning. There was, however, a statistically significant increase in concept area scores for “internal energy versus enthalpy” and “equilibrium versus steady state”. Although there was not a significant increase in concept inventory scores for “entropy”, written analyses showed that most students’ misconceptions were repaired.
Students transferred knowledge effectively and retained most of the information in the concept areas of “internal energy versus enthalpy” and “equilibrium versus steady state”.
Performance Tuning Non-Uniform Sampling for Sensitivity Enhancement of Signal-Limited Biological NMR
Abstract:
Non-uniform sampling (NUS) has been established as a route to obtaining true sensitivity enhancements when recording indirect dimensions of decaying signals in the same total experimental time as traditional uniform incrementation of the indirect evolution period. Theory and experiments have shown that NUS can yield up to two-fold improvements in the intrinsic signal-to-noise ratio (SNR) of each dimension, while even conservative protocols can yield 20-40 % improvements in the intrinsic SNR of NMR data. Applications of biological NMR that can benefit from these improvements are emerging, and in this work we develop some practical aspects of applying NUS nD-NMR to studies that approach the traditional detection limit of nD-NMR spectroscopy. Conditions for obtaining high NUS sensitivity enhancements are considered here in the context of enabling H-1,N-15-HSQC experiments on natural abundance protein samples and H-1,C-13-HMBC experiments on a challenging natural product. Through systematic studies we arrive at more precise guidelines to contrast sensitivity enhancements with reduced line shape constraints, and report an alternative sampling density based on a quarter-wave sinusoidal distribution that returns the highest fidelity we have seen to date in line shapes obtained by maximum entropy processing of non-uniformly sampled data.
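The quarter-wave sinusoidal sampling density mentioned above can be sketched as a weighted draw of indirect-dimension increments: dense sampling early in the signal decay, sparse sampling late. This is an illustrative reconstruction, not the authors' implementation; the schedule size and seed are arbitrary:

```python
import math
import random

def quarter_wave_schedule(n_total, n_keep, seed=0):
    """Pick n_keep of n_total indirect-dimension increments, with the
    probability of keeping increment i following a quarter-wave cosine
    density over [0, pi/2)."""
    rng = random.Random(seed)
    weights = [math.cos(math.pi * i / (2 * n_total)) for i in range(n_total)]
    idx = list(range(n_total))
    chosen = []
    for _ in range(n_keep):
        # weighted draw without replacement
        total = sum(weights[i] for i in idx)
        r = rng.random() * total
        acc = 0.0
        for pos, i in enumerate(idx):
            acc += weights[i]
            if acc >= r:
                chosen.append(idx.pop(pos))
                break
    return sorted(chosen)

sched = quarter_wave_schedule(64, 32)  # a 50% schedule over 64 increments
```

The resulting schedule concentrates samples where the decaying signal is strongest, which is the mechanism behind the intrinsic SNR gain discussed above.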
Abstract:
Recently, we have demonstrated that considerable inherent sensitivity gains are attained in MAS NMR spectra acquired by nonuniform sampling (NUS) and introduced maximum entropy interpolation (MINT) processing that assures the linearity of transformation between the time and frequency domains. In this report, we examine the utility of the NUS/MINT approach in multidimensional datasets possessing high dynamic range, such as homonuclear C-13-C-13 correlation spectra. We demonstrate on model compounds and on 1-73-(U-C-13,N-15)/74-108-(U-N-15) E. coli thioredoxin reassembly, that with appropriately constructed 50 % NUS schedules inherent sensitivity gains of 1.7-2.1-fold are readily reached in such datasets. We show that both linearity and line width are retained under these experimental conditions throughout the entire dynamic range of the signals. Furthermore, we demonstrate that the reproducibility of the peak intensities is excellent in the NUS/MINT approach when experiments are repeated multiple times and identical experimental and processing conditions are employed. Finally, we discuss the principles for design and implementation of random exponentially biased NUS sampling schedules for homonuclear C-13-C-13 MAS correlation experiments that yield high-quality artifact-free datasets.
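A random exponentially biased 50% NUS schedule of the kind discussed above might be generated as follows. This is only a sketch: the decay constant (`t2_fraction`), seed, and acceptance-sampling scheme are assumptions for illustration, not the authors' protocol:

```python
import math
import random

def exp_biased_schedule(n_total, fraction=0.5, t2_fraction=0.5, seed=1):
    """Random exponentially biased NUS schedule: increment i is kept with
    probability proportional to exp(-i / (t2_fraction * n_total)), i.e.
    biased toward an assumed signal envelope exp(-t/T2)."""
    rng = random.Random(seed)
    target = round(n_total * fraction)
    probs = [math.exp(-i / (t2_fraction * n_total)) for i in range(n_total)]
    scale = target / sum(probs)
    schedule = {0}  # always keep the first increment
    while len(schedule) < target:
        i = rng.randrange(n_total)
        if rng.random() < min(1.0, probs[i] * scale):
            schedule.add(i)
    return sorted(schedule)

sched = exp_biased_schedule(128)  # 50% schedule over 128 increments
```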
Abstract:
Assessments of environmental and territorial justice are similar in that both assess whether empirical relations between the spatial arrangement of undesirable hazards (or desirable public goods and services) and socio-demographic groups are consistent with notions of social justice, evaluating the spatial distribution of benefits and burdens (outcome equity) and the process that produces observed differences (process equity). Using proximity to major highways in NYC as a case study, we review methodological issues pertinent to both fields and discuss the choice and computation of exposure measures, but focus primarily on measures of inequity. We present inequity measures computed from the empirically estimated joint distribution of exposure and demographics and compare them to traditional measures such as linear regression, logistic regression and Theil’s entropy index. We find that measures computed from the full joint distribution provide more unified, transparent and intuitive operational definitions of inequity, and we show how the approach can be used to structure siting and decommissioning decisions.
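Theil's entropy index, cited above as a traditional inequity measure, has a compact definition that can be computed directly from exposure values. A generic formula, not the paper's specific estimator:

```python
import math

def theil_index(exposures):
    """Theil's entropy index T = (1/n) * sum (x_i/mu) * ln(x_i/mu).
    T = 0 for perfect equality; the maximum, ln(n), is reached when one
    unit holds all exposure. Zero-valued entries contribute nothing."""
    n = len(exposures)
    mu = sum(exposures) / n
    return sum((x / mu) * math.log(x / mu) for x in exposures if x > 0) / n

t_equal = theil_index([2.0, 2.0, 2.0, 2.0])    # perfectly equal exposure
t_skewed = theil_index([0.0, 0.0, 0.0, 10.0])  # all exposure on one unit
```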
Abstract:
BACKGROUND: Propofol and sevoflurane display additivity for gamma-aminobutyric acid receptor activation, loss of consciousness, and tolerance of skin incision. Information about their interaction regarding electroencephalographic suppression is unavailable. This study examined this interaction, as well as the interaction on the probability of tolerance of shake and shout and three noxious stimulations, by using a response surface methodology. METHODS: Sixty patients preoperatively received different combined concentrations of propofol (0-12 microg/ml) and sevoflurane (0-3.5 vol.%) according to a crisscross design (274 concentration pairs, 3 to 6 per patient). After having reached pseudo-steady state, the authors recorded bispectral index, state and response entropy, and the response to shake and shout, tetanic stimulation, laryngeal mask airway insertion, and laryngoscopy. For the analysis of the probability of tolerance by logistic regression, a Greco interaction model was used. For the separate analysis of bispectral index, state and response entropy suppression, a fractional Emax Greco model was used. All calculations were performed with NONMEM V (GloboMax LLC, Hanover, MD). RESULTS: Additivity was found for all endpoints. The Ce(50,PROP)/Ce(50,SEVO) pairs were 3.68 microg ml(-1)/1.53 vol.% for bispectral index suppression, 2.34 microg ml(-1)/1.03 vol.% for tolerance of shake and shout, 5.34 microg ml(-1)/2.11 vol.% for tetanic stimulation, 5.92 microg ml(-1)/2.55 vol.% for laryngeal mask airway insertion, and 6.55 microg ml(-1)/2.83 vol.% for laryngoscopy. CONCLUSION: For both electroencephalographic suppression and tolerance to stimulation, the interaction of propofol and sevoflurane was identified as additive. The response surface data can be used for more rational dose finding in case of sequential administration and coadministration of propofol and sevoflurane.
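The Greco interaction model behind the tolerance analysis can be sketched as follows. The slope parameter `gamma` is an assumed placeholder (the study's fitted slopes are not given in the abstract), and `alpha = 0` encodes the additivity the study reports; the C50 values in the example are the shake-and-shout estimates quoted above:

```python
def greco_tolerance_prob(c_prop, c_sevo, c50_prop, c50_sevo,
                         gamma=4.0, alpha=0.0):
    """Greco interaction model for probability of tolerance:
        U = cp/C50p + cs/C50s + alpha * (cp/C50p) * (cs/C50s)
        P = U**gamma / (1 + U**gamma)
    alpha = 0 corresponds to additivity; gamma is an assumed slope."""
    up = c_prop / c50_prop
    us = c_sevo / c50_sevo
    u = up + us + alpha * up * us
    return u ** gamma / (1.0 + u ** gamma)

# Shake-and-shout C50 pair from the abstract: 2.34 microg/ml propofol,
# 1.03 vol.% sevoflurane. Any combination with U = 1 gives P = 0.5.
p_prop_alone = greco_tolerance_prob(2.34, 0.0, 2.34, 1.03)
p_half_half = greco_tolerance_prob(1.17, 0.515, 2.34, 1.03)
```

Under additivity, every concentration pair whose normalized concentrations sum to 1 sits on the same iso-effect line, which is exactly what the half-dose example demonstrates.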
Abstract:
The extraction of the finite temperature heavy quark potential from lattice QCD relies on a spectral analysis of the Wilson loop. General arguments tell us that the lowest lying spectral peak encodes, through its position and shape, the real and imaginary parts of this complex potential. Here we benchmark this extraction strategy using leading order hard-thermal loop (HTL) calculations. In other words, we analytically calculate the Wilson loop and determine the corresponding spectrum. By fitting its lowest lying peak we obtain the real and imaginary parts and confirm that knowledge of the lowest peak alone is sufficient for obtaining the potential. Access to the full spectrum allows an investigation of spectral features that do not contribute to the potential but can pose a challenge to numerical attempts at an analytic continuation from imaginary time data. Differences in these contributions between the Wilson loop and gauge-fixed Wilson line correlators are discussed. To better understand the difficulties in a numerical extraction we deploy the maximum entropy method with extended search space to HTL correlators in Euclidean time and observe how well the known spectral function and values for the real and imaginary parts are reproduced. Possible avenues for improvement of the extraction strategy are discussed.
Abstract:
The dynamics of glass is of importance in materials science but its nature has not yet been fully understood. Here we report that a verification of the temperature dependencies of the primary relaxation time or viscosity in the ultraslowing/ultraviscous domain of glass-forming systems can be carried out via the analysis of the inverse of the Dyre-Olsen temperature index. The subsequent analysis of experimental data indicates the possibility of a self-consistent description of glass-forming low-molecular-weight liquids, polymers, liquid crystals, orientationally disordered crystals and Ising spin-glass-like systems, as well as the prevalence of equations associated with the 'finite temperature divergence'. All of these lead to a new formula for the configurational entropy in glass-forming systems. Furthermore, a link to the dominant local symmetry for a given glass former is identified here. The results obtained show a new relationship between the glass transition and critical phenomena.
Abstract:
The artificial pancreas is at the forefront of research toward automatic insulin infusion for patients with type 1 diabetes. Due to the high inter- and intra-patient variability of the diabetic population, the need for personalized approaches has been raised. This study presents an adaptive, patient-specific control strategy for glucose regulation based on reinforcement learning, and more specifically on the Actor-Critic (AC) learning approach. The control algorithm provides daily updates of the basal rate and insulin-to-carbohydrate (IC) ratio in order to optimize glucose regulation. A method for the automatic and personalized initialization of the control algorithm is designed based on the estimation of the transfer entropy (TE) between insulin and glucose signals. The algorithm has been evaluated in silico in adults, adolescents and children for 10 days. Three scenarios of initialization to i) zero values, ii) random values and iii) TE-based values have been comparatively assessed. The results have shown that when the TE-based initialization is used, the algorithm achieves faster learning, with 98%, 90% and 73% in the A+B zones of the Control Variability Grid Analysis for adults, adolescents and children respectively after five days, compared to 95%, 78%, 41% for random initialization and 93%, 88%, 41% for zero initial values. Furthermore, in the case of children, the daily Low Blood Glucose Index is reduced much faster when the TE-based tuning is applied. The results imply that automatic and personalized tuning based on TE reduces the learning period and improves the overall performance of the AC algorithm.
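Transfer entropy, used above to initialize the controller, can be estimated for two discretized signals with a simple plug-in estimator. This is a generic history-length-1 sketch with equal-width binning, not the study's estimator; the synthetic signals are illustrative:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in estimate of TE(X -> Y) with history length 1:
    TE = sum p(y1, y0, x0) * log[ p(y1 | y0, x0) / p(y1 | y0) ],
    with both signals discretized into equal-width bins."""
    def digitize(s):
        lo, hi = min(s), max(s)
        w = (hi - lo) / bins or 1.0
        return [min(int((v - lo) / w), bins - 1) for v in s]
    xd, yd = digitize(x), digitize(y)
    n = len(yd) - 1
    triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))
    pairs_yy = Counter(zip(yd[1:], yd[:-1]))
    pairs_yx = Counter(zip(yd[:-1], xd[:-1]))
    hist_y = Counter(yd[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_y = pairs_yy[(y1, y0)] / hist_y[y0]
        te += p_joint * math.log(p_cond_full / p_cond_y)
    return te

# Synthetic check: y copies x with a one-step lag, so information should
# flow from x to y and not the other way around.
rng = random.Random(0)
x = [rng.random() for _ in range(500)]
y = [0.0] + x[:-1]
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
```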
Abstract:
Quantitative EEG (qEEG) has modified our understanding of epileptic seizures, shifting our view from the traditionally accepted hyper-synchrony paradigm toward more complex models based on re-organization of functional networks. However, qEEG measurements are so far rarely considered during the clinical decision-making process. To better understand the dynamics of intracranial EEG signals, we examine a functional network derived from the quantification of information flow between intracranial EEG signals. Using transfer entropy, we analyzed 198 seizures from 27 patients undergoing pre-surgical evaluation for pharmaco-resistant epilepsy. During each seizure we considered for each network the in-, out- and total "hubs", defined respectively as the time and the EEG channels with the maximal incoming, outgoing or total (bidirectional) information flow. In the majority of cases we found that the hubs occur around the middle of seizures, and interestingly not at the beginning or end, where the most dramatic EEG signal changes are found by visual inspection. For the patients who then underwent surgery, good postoperative clinical outcome was on average associated with a higher percentage of out- or total-hubs located in the resected area (for out-hubs p = 0.01, for total-hubs p = 0.04). The location of in-hubs showed no clear predictive value. We conclude that the study of functional networks based on qEEG measurements may help to identify brain areas that are critical for seizure generation and are thus potential targets for focused therapeutic interventions.
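Given a matrix of pairwise transfer-entropy values between channels, the in-, out- and total hubs defined above reduce to row and column maxima. A minimal sketch with a hypothetical 3-channel flow matrix (not data from the study):

```python
def find_hubs(te_matrix):
    """Given te_matrix[i][j] = information flow from channel i to channel j
    (diagonal zero), return (out_hub, in_hub, total_hub): the channel
    indices with maximal outgoing, incoming and bidirectional flow."""
    n = len(te_matrix)
    out_flow = [sum(row) for row in te_matrix]
    in_flow = [sum(te_matrix[i][j] for i in range(n)) for j in range(n)]
    total_flow = [o + i for o, i in zip(out_flow, in_flow)]
    out_hub = max(range(n), key=lambda i: out_flow[i])
    in_hub = max(range(n), key=lambda i: in_flow[i])
    total_hub = max(range(n), key=lambda i: total_flow[i])
    return out_hub, in_hub, total_hub

# Hypothetical matrix: channel 0 mostly drives, channel 2 mostly receives
te = [[0.0, 2.0, 2.0],
      [0.0, 0.0, 1.0],
      [0.0, 0.0, 0.0]]
hubs = find_hubs(te)
```

In the study this computation is done per time window, so a hub is a (time, channel) pair; the sketch shows only the per-window channel selection.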
Abstract:
We present a novel approach for the reconstruction of spectra from Euclidean correlator data that makes close contact to modern Bayesian concepts. It is based upon an axiomatically justified dimensionless prior distribution, which in the case of constant prior function m(ω) only imprints smoothness on the reconstructed spectrum. In addition we are able to analytically integrate out the only relevant overall hyper-parameter α in the prior, removing the necessity for Gaussian approximations found e.g. in the Maximum Entropy Method. Using a quasi-Newton minimizer and high-precision arithmetic, we are then able to find the unique global extremum of P[ρ|D] in the full Nω » Nτ dimensional search space. The method actually yields gradually improving reconstruction results if the quality of the supplied input data increases, without introducing artificial peak structures, often encountered in the MEM. To support these statements we present mock data analyses for the case of zero width delta peaks and more realistic scenarios, based on the perturbative Euclidean Wilson Loop as well as the Wilson Line correlator in Coulomb gauge.
Abstract:
We report on a comprehensive signal processing procedure for very low signal levels for the measurement of neutral deuterium in the local interstellar medium from a spacecraft in Earth orbit. The deuterium measurements were performed with the IBEX-Lo camera on NASA’s Interstellar Boundary Explorer (IBEX) satellite. Our analysis technique for these data consists of creating a mass relation in three-dimensional time of flight space to accurately determine the position of the predicted D events, to precisely model the tail of the H events in the region where the H tail events are near the expected D events, and then to separate the H tail from the observations to extract the very faint D signal. This interstellar D signal, which is expected to be a few counts per year, is extracted from a strong terrestrial background signal, consisting of sputter products from the sensor’s conversion surface. As reference we accurately measure the terrestrial D/H ratio in these sputtered products and then discriminate this terrestrial background source. During the three years of the mission time when the deuterium signal was visible to IBEX, the observation geometry and orbit allowed for a total observation time of 115.3 days. Because of the spinning of the spacecraft and the stepping through eight energy channels the actual observing time of the interstellar wind was only 1.44 days. With the optimised data analysis we found three counts that could be attributed to interstellar deuterium. These results update our earlier work.
Abstract:
Methods for tracking an object have generally fallen into two groups: tracking by detection and tracking through local optimization. The advantage of detection-based tracking is its ability to deal with target appearance and disappearance, but it does not naturally take advantage of target motion continuity during detection. The advantage of local optimization is efficiency and accuracy, but it requires additional algorithms to initialize tracking when the target is lost. To bridge these two approaches, we propose a framework for unified detection and tracking as a time-series Bayesian estimation problem. The basis of our approach is to treat both detection and tracking as a sequential entropy minimization problem, where the goal is to determine the parameters describing a target in each frame. To do this we integrate the Active Testing (AT) paradigm with Bayesian filtering, and this results in a framework capable of both detecting and tracking robustly in situations where the target object enters and leaves the field of view regularly. We demonstrate our approach on a retinal tool tracking problem and show through extensive experiments that our method provides an efficient and robust tracking solution.