944 results for Entropy diagrams
Abstract:
By means of the time-dependent density matrix renormalization group algorithm we study the zero-temperature dynamics of the von Neumann entropy of a block of spins in a Heisenberg chain after a sudden quench in the anisotropy parameter. In the absence of any disorder the block entropy increases linearly with time and then saturates. We analyse the velocity of propagation of the entanglement as a function of the initial and final anisotropies and compare our results, wherever possible, with those obtained by means of conformal field theory. In the disordered case we find a slower (logarithmic) evolution, which may signal the onset of entanglement localization.
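The linear growth followed by saturation described here matches the standard quasiparticle picture from conformal field theory. As a hedged illustration (a textbook form, not taken from this paper's data), for a block of length ℓ and quasiparticle velocity v the block entropy behaves roughly as:

```latex
S_\ell(t) \;\simeq\;
\begin{cases}
2\,s\,v\,t, & t < \ell/(2v) \quad \text{(linear growth)}\\[4pt]
s\,\ell, & t \ge \ell/(2v) \quad \text{(saturation)}
\end{cases}
```

Here s is an entropy density per unit length fixed by the quench; fitting the measured slope against 2sv is one way the propagation velocity mentioned in the abstract can be extracted.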
Abstract:
Amorphous drug-polymer solid dispersions have the potential to enhance the dissolution performance and thus bioavailability of BCS class II drug compounds. The principal drawback of this approach is the limited physical stability of amorphous drug within the dispersion. Accurate determination of the solubility and miscibility of drug in the polymer matrix is the key to the successful design and development of such systems. In this paper, we propose a novel method, based on Flory-Huggins theory, to predict and compare the solubility and miscibility of drug in polymeric systems. The systems chosen for this study are (1) hydroxypropyl methylcellulose acetate succinate HF grade (HPMCAS-HF)-felodipine (FD) and (2) Soluplus (a graft copolymer of polyvinyl caprolactam-polyvinyl acetate-polyethylene glycol)-FD. Samples containing different drug compositions were mixed, ball milled, and then analyzed by differential scanning calorimetry (DSC). The value of the drug-polymer interaction parameter χ was calculated from the crystalline drug melting depression data and extrapolated to lower temperatures. The interaction parameter χ was also calculated at 25 °C for both systems using the van Krevelen solubility parameter method. The rank order of interaction parameters of the two systems obtained at this temperature was comparable. Drug-polymer temperature-composition and free energy of mixing (ΔGmix) diagrams were constructed for both systems. The maximum crystalline drug solubility and amorphous drug miscibility may be predicted from the phase diagrams. Hyper-DSC was used to assess the validity of the constructed phase diagrams by annealing solid dispersions at specific drug loadings. Three different samples for each polymer were selected to represent different regions within the phase diagram.
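The melting-point-depression route to χ that this abstract describes is commonly written in the Flory-Huggins form shown in the sketch below. The Python code is a hedged illustration of the fit: the ΔHfus and Tm values are approximate literature figures for felodipine, and the molar-volume ratio and DSC data points are invented, not the paper's.

```python
import numpy as np

R = 8.314          # gas constant, J/(mol K)
T_M = 417.0        # pure felodipine melting point, K (approximate literature value)
DH_FUS = 30.8e3    # enthalpy of fusion, J/mol (approximate literature value)
M = 100.0          # polymer-to-drug molar volume ratio (assumed)

def chi_from_depression(t_mix, phi_drug):
    """Flory-Huggins interaction parameter from a depressed melting point.

    Rearranges the melting-point-depression equation
      1/T_mix - 1/T_m = -(R/DH_fus) [ln(phi_d) + (1 - 1/m) phi_p + chi phi_p^2]
    to solve for chi at each composition.
    """
    phi_p = 1.0 - phi_drug
    lhs = -(DH_FUS / R) * (1.0 / t_mix - 1.0 / T_M)
    return (lhs - np.log(phi_drug) - (1.0 - 1.0 / M) * phi_p) / phi_p**2

# hypothetical DSC melting endpoints (K) at several drug volume fractions
phi = np.array([0.95, 0.90, 0.85, 0.80])
tm = np.array([415.2, 413.0, 410.1, 406.5])
chi = chi_from_depression(tm, phi)

# extrapolate to lower temperatures via chi(T) = A + B/T
B, A = np.polyfit(1.0 / tm, chi, 1)
print(f"chi(T) ~ {A:.3f} + {B:.1f}/T;  chi(298 K) ~ {A + B / 298.0:.3f}")
```

The linear chi(T) = A + B/T extrapolation is the usual device for carrying the high-temperature DSC fit down to storage temperatures, where the phase diagram is actually needed.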
Abstract:
Life science research aims to continuously improve the quality and standard of human life. One of the major challenges in this area is to maintain food safety and security. A number of image processing techniques have been used to investigate the quality of food products. In this paper, we propose a new algorithm to effectively segment connected grains so that each of them can be inspected in a later processing stage. One family of the existing segmentation methods is based on the idea of watersheding, and it has shown promising results in practice. However, due to the over-segmentation issue, this technique has performed poorly in various settings, such as inhomogeneous backgrounds and connected targets. To solve this problem, we present a combination of two classical techniques. In the first step, a mean shift filter is used to eliminate the inhomogeneous background, with entropy used as the convergence criterion. In the second step, a color gradient algorithm is used to detect the most significant edges, and a marker-controlled watershed transform is applied to segment the clustered objects produced by the previous processing stages. The proposed framework balances execution time, usability, efficiency and segmentation quality in analyzing ring die pellets. The experimental results demonstrate that the proposed approach is effective and robust.
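As a hedged sketch of the pipeline this abstract outlines (mean shift filtering with an entropy-based stopping rule, a color gradient, then a marker-controlled watershed), the following Python code uses standard OpenCV calls; the filter parameters, thresholds and input filename are illustrative assumptions, not the paper's.

```python
import cv2
import numpy as np

def shannon_entropy(gray):
    """Shannon entropy of an 8-bit image's grey-level histogram."""
    hist = cv2.calcHist([gray], [0], None, [256], [0, 256]).ravel()
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

img = cv2.imread("pellets.png")  # hypothetical input image

# Step 1: iterate mean shift filtering until the entropy stops changing
prev_h = None
for _ in range(10):
    img = cv2.pyrMeanShiftFiltering(img, sp=15, sr=30)
    h = shannon_entropy(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY))
    if prev_h is not None and abs(h - prev_h) < 1e-3:
        break
    prev_h = h

# Step 2: color gradient = per-channel Sobel magnitude, maximum over channels
mags = [cv2.magnitude(cv2.Sobel(c, cv2.CV_32F, 1, 0),
                      cv2.Sobel(c, cv2.CV_32F, 0, 1)) for c in cv2.split(img)]
grad = cv2.normalize(np.max(mags, axis=0), None, 0, 255,
                     cv2.NORM_MINMAX).astype(np.uint8)

# Step 3: seeds from the distance transform, then watershed on the gradient
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
_, bw = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
dist = cv2.distanceTransform(bw, cv2.DIST_L2, 5)
_, fg = cv2.threshold(dist, 0.5 * dist.max(), 255, 0)
_, markers = cv2.connectedComponents(fg.astype(np.uint8))
# label 0 is treated as "unknown" and flooded from the seeds;
# a production version would also seed the sure background
labels = cv2.watershed(cv2.cvtColor(grad, cv2.COLOR_GRAY2BGR), markers)
# boundaries between touching pellets are marked with -1 in `labels`
```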
Abstract:
Identifying responsibilities for classes in the object-oriented software design phase is a crucial task. This paper proposes an approach for producing high-quality and robust behavioural diagrams (e.g. sequence diagrams) through Class Responsibility Assignment (CRA). GRASP, or General Responsibility Assignment Software Pattern (or Principle), was used to direct the CRA process when deriving behavioural diagrams. A set of tools to support CRA was developed to provide designers and developers with a cognitive toolkit that can be used when analysing and designing object-oriented software. The tool developed is called Use Case Specification to Sequence Diagrams (UC2SD). UC2SD uses a new approach for developing Unified Modelling Language (UML) software designs from natural language, making use of a meta-domain-oriented ontology, well-established software design principles and established Natural Language Processing (NLP) tools. UC2SD generates well-formed UML sequence diagrams as output.
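UC2SD's own pipeline is not spelled out in the abstract. As a hedged illustration of the kind of NLP step such tools rely on, this sketch uses spaCy (an assumption, not necessarily UC2SD's toolset) to pull subject-verb-object triples out of a use case sentence as candidate sequence-diagram messages.

```python
import spacy

nlp = spacy.load("en_core_web_sm")

def candidate_messages(use_case_step):
    """Extract (sender, message, receiver) candidates from one use case sentence."""
    doc = nlp(use_case_step)
    triples = []
    for token in doc:
        if token.pos_ == "VERB":
            subjects = [c for c in token.children if c.dep_ == "nsubj"]
            objects = [c for c in token.children if c.dep_ in ("dobj", "obj")]
            for s in subjects:
                for o in objects:
                    triples.append((s.text, token.lemma_, o.text))
    return triples

print(candidate_messages("The customer submits the order to the system."))
# -> [('customer', 'submit', 'order')]
```

A real CRA step would then use GRASP heuristics (e.g. Information Expert) to decide which class receives each candidate message.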
On the complexity of solving polytree-shaped limited memory influence diagrams with binary variables
Abstract:
Influence diagrams are intuitive and concise representations of structured decision problems. When the problem is non-Markovian, an optimal strategy can be exponentially large in the size of the diagram. We can avoid the inherent intractability by constraining the size of admissible strategies, giving rise to limited memory influence diagrams. A natural question is then how small strategies need to be to enable efficient optimal planning. Arguably, the smallest strategies one can conceive simply prescribe an action for each time step, without considering past decisions or observations. Previous work has shown that finding such optimal strategies even for polytree-shaped diagrams with ternary variables and a single value node is NP-hard, but the case of binary variables was left open. In this paper we address that case, first noting that optimal strategies can be obtained in polynomial time for polytree-shaped diagrams with binary variables and a single value node. We then show that the same problem is NP-hard if the diagram has multiple value nodes. These two results close the fixed-parameter complexity analysis of optimal strategy selection in influence diagrams parametrized by the shape of the diagram, the number of value nodes and the maximum variable cardinality.
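To make the strategy notion concrete, here is a hedged toy sketch (the model and all numbers are invented, not from the paper): a two-stage diagram with binary variables, where a memoryless strategy simply fixes one action per decision and is scored by exhaustive expectation.

```python
import itertools
import numpy as np

# Toy chain S0 -> S1, with decision D0 affecting the transition and
# decision D1 scored against S1. All variables binary; numbers invented.
p_s0 = np.array([0.6, 0.4])                   # P(S0)
p_s1 = np.array([[[0.9, 0.1], [0.3, 0.7]],    # P(S1 | S0=0, D0)
                 [[0.2, 0.8], [0.5, 0.5]]])   # P(S1 | S0=1, D0)
u = np.array([[1.0, 0.0],                     # U(S1=0, D1)
              [0.2, 0.9]])                    # U(S1=1, D1)

def expected_utility(d0, d1):
    """Memoryless strategy: one fixed action per decision, no observations."""
    return sum(p_s0[s0] * p_s1[s0, d0, s1] * u[s1, d1]
               for s0 in (0, 1) for s1 in (0, 1))

best = max(itertools.product((0, 1), repeat=2),
           key=lambda s: expected_utility(*s))
print(best, expected_utility(*best))
```

With n binary decisions there are 2^n such strategies; the abstract's results delimit exactly when this brute-force search can be beaten in polynomial time.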
Abstract:
We present a new algorithm for exactly solving decision-making problems represented as influence diagrams. We do not require the usual assumptions of no-forgetting and regularity; this allows us to solve problems with simultaneous decisions and limited information. The algorithm is empirically shown to outperform a state-of-the-art algorithm on randomly generated problems of up to 150 variables and 10^64 solutions. We show that these problems are NP-hard even if the underlying graph structure of the problem has low treewidth and the variables take on a bounded number of states, and that they admit no provably good approximation if variables can take on an arbitrary number of states.
Abstract:
We present a new algorithm for exactly solving decision-making problems represented as influence diagrams. We do not require the usual assumptions of no-forgetting and regularity; this allows us to solve problems with simultaneous decisions and limited information. The algorithm is empirically shown to outperform a state-of-the-art algorithm on randomly generated problems of up to 150 variables and 10^64 solutions. We show that the problem is NP-hard even if the underlying graph structure of the problem has small treewidth and the variables take on a bounded number of states, but that a fully polynomial time approximation scheme exists for these cases. Moreover, we show that the bound on the number of states is a necessary condition for any efficient approximation scheme.
Abstract:
Influence diagrams allow for an intuitive and yet precise description of complex situations involving decision making under uncertainty. Unfortunately, most of the problems described by influence diagrams are hard to solve. In this paper we discuss the complexity of approximately solving influence diagrams. We do not assume no-forgetting or regularity, which makes the class of problems we address very broad. Remarkably, we show that when both the treewidth and the cardinality of the variables are bounded, the problem admits a fully polynomial-time approximation scheme.
Abstract:
Background: The identification of pre-clinical microvascular damage in hypertension by non-invasive techniques has proved frustrating for clinicians. This proof of concept study investigated whether entropy, a novel summary measure for characterizing blood velocity waveforms, is altered in participants with hypertension and may therefore be useful in risk stratification.
Methods: Doppler ultrasound waveforms were obtained from the carotid and retrobulbar circulation in 42 participants with uncomplicated grade 1 hypertension (mean systolic/diastolic blood pressure (BP) 142/92 mmHg), and 26 healthy controls (mean systolic/diastolic BP 116/69 mmHg). Mean wavelet entropy was derived from flow-velocity data and compared with traditional haemodynamic measures of microvascular function, namely the resistive and pulsatility indices.
Results: Entropy was significantly higher in control participants in the central retinal artery (CRA) (differential mean 0.11 (standard error 0.05 cm s^-1), CI 0.009 to 0.219, p = 0.017) and ophthalmic artery (0.12 (0.05), CI 0.004 to 0.215, p = 0.04). In comparison, the resistive index (0.12 (0.05), CI 0.005 to 0.226, p = 0.029) and pulsatility index (0.96 (0.38), CI 0.19 to 1.72, p = 0.015) showed significant differences between groups in the CRA alone. Regression analysis indicated that entropy was significantly influenced by age and systolic blood pressure (r values 0.4-0.6). None of the measures were significantly altered in the larger conduit vessel.
Conclusion: This is the first application of entropy to human blood velocity waveform analysis and shows that this new technique has the ability to discriminate health from early hypertensive disease, thereby promoting the early identification of cardiovascular disease in a young hypertensive population.
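The abstract does not define "mean wavelet entropy" precisely; as a hedged sketch, the Python code below computes the common relative-wavelet-energy (Shannon) form of wavelet entropy with PyWavelets, applied to a synthetic velocity waveform. The wavelet choice, decomposition depth and signal are assumptions, not the study's.

```python
import numpy as np
import pywt

def wavelet_entropy(signal, wavelet="db4", level=5):
    """Shannon entropy of the relative energies of the wavelet detail bands."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs[1:]])  # detail levels only
    p = energies / energies.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Synthetic pulsatile velocity waveform (cm/s): fundamental plus one harmonic
t = np.linspace(0.0, 4.0, 2048)  # four seconds at 512 Hz
v = 30 + 15 * np.sin(2 * np.pi * 1.2 * t) + 5 * np.sin(2 * np.pi * 2.4 * t)
v += np.random.default_rng(0).normal(0.0, 1.0, t.size)
print(f"wavelet entropy: {wavelet_entropy(v):.3f}")
```

A more complex, less ordered waveform spreads its energy across more bands and so yields a higher entropy, which is the property the study exploits to separate groups.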
Abstract:
Cascade control is one of the routinely used control strategies in industrial processes because it can dramatically improve the performance of single-loop control, reducing both the maximum deviation and the integral error of the disturbance response. Currently, many control performance assessment methods for cascade control loops are developed under the assumption that all disturbances follow a Gaussian distribution. In practice, however, disturbances entering through the manipulated variable or from upstream often exhibit nonlinear, non-Gaussian behaviour. In this paper, a general and effective index for the performance assessment of cascade control systems subject to disturbances of unknown distribution is proposed. As in minimum variance control (MVC) design, the output variances of the primary and secondary loops are decomposed into a cascade-invariant and a cascade-dependent term, but the ARMA model for the cascade control loop is estimated using the minimum entropy criterion, rather than the minimum mean squared error, to accommodate non-Gaussian disturbances. Unlike the MVC index, the proposed control performance index is based on information theory and the minimum entropy criterion. The index is informative and in agreement with the expected control knowledge. To demonstrate the wide applicability and effectiveness of the minimum entropy cascade control index, a simulation problem and a cascade control case from an oil refinery are studied. A comparison with MVC-based cascade control assessment is also included.
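As a hedged single-loop sketch of the entropy-based idea (the paper's cascade decomposition is more involved, and the model, delay and data below are all invented), one can fit an ARMA model to routine output data, take the delay-invariant part of the impulse response as the benchmark, and compare differential entropies rather than variances.

```python
import numpy as np
from scipy.stats import gaussian_kde
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import arma2ma

def differential_entropy(x):
    """Resubstitution estimate of differential entropy via a Gaussian KDE."""
    kde = gaussian_kde(x)
    return float(-np.mean(np.log(kde(x))))

rng = np.random.default_rng(1)
d = 3                                  # assumed process time delay (samples)
e = rng.laplace(0.0, 1.0, 2000)        # non-Gaussian innovations
y = np.convolve(e, [1.0, 0.8, 0.5, 0.3, 0.2], mode="full")[: e.size]  # toy output

fit = ARIMA(y, order=(4, 0, 1)).fit()
ar = np.r_[1.0, -fit.arparams]
ma = np.r_[1.0, fit.maparams]
psi = arma2ma(ar, ma, lags=d)          # first d impulse-response weights

# Feedback-invariant component: innovations filtered by the first d weights
e_hat = fit.resid
invariant = np.convolve(e_hat, psi, mode="full")[: e_hat.size]

h_bench = differential_entropy(invariant)   # minimum achievable (benchmark) entropy
h_actual = differential_entropy(y - y.mean())
eta = np.exp(h_bench - h_actual)            # entropy-based index in (0, 1]
print(f"entropy-based performance index: {eta:.3f}")
```

An index near 1 indicates the loop is close to the entropy benchmark, mirroring how the MVC index reads, but without assuming Gaussian disturbances.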
Abstract:
Purpose: Amorphous drug-polymer solid dispersions have been found to result in improved drug dissolution rates when compared to their crystalline counterparts. However, when the drug exists in the amorphous form it possesses a higher Gibbs free energy than its associated crystalline state and can recrystallize. Drug-polymer phase diagrams constructed through the application of Flory-Huggins (F-H) theory contain a wealth of information regarding the thermodynamic and kinetic stability of the amorphous drug-polymer system. This study aimed to evaluate the effects of various experimental conditions on the detection of solubility and miscibility in drug-polymer binary systems. Methods: Felodipine (FD)-Polyvinylpyrrolidone (PVP) K15 (PVPK15) and FD-Polyvinylpyrrolidone/vinyl acetate (PVP/VA64) were the systems selected for this research. Physical mixtures with different drug loadings were mixed and ball milled. These samples were then processed using Differential Scanning Calorimetry (DSC), and the melting point (Tend) and glass transition (Tg) were measured at heating rates of 0.5, 1.0 and 5.0 °C/min. Results: The melting point depression data were then used to calculate the F-H interaction parameter (χ) and extrapolated to lower temperatures to complete the liquid-solid transition curves. The theoretical binodal and spinodal curves were also constructed and used to identify regions within the phase diagram. The effects of polymer selection, DSC heating rate, time above the parent polymer Tg and polymer molecular weight were investigated by identifying amorphous drug miscibility limits at pharmaceutically relevant temperatures. Conclusion: The potential implications of these findings when applied to a non-ambient processing method such as Hot Melt Extrusion (HME) are also discussed.
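Given a fitted χ(T) = A + B/T, the spinodal region this abstract mentions follows from the Flory-Huggins free energy of mixing. Below is a minimal Python sketch of the spinodal construction; the A, B and molar-volume ratio are invented for illustration, not the study's fitted values.

```python
import numpy as np

A, B = -5.0, 2800.0  # assumed fit of chi(T) = A + B/T from melting depression
M = 100.0            # assumed polymer-to-drug molar volume ratio

def dg_mix(phi_d, chi):
    """Flory-Huggins free energy of mixing per lattice site, in units of RT."""
    phi_p = 1.0 - phi_d
    return phi_d * np.log(phi_d) + (phi_p / M) * np.log(phi_p) + chi * phi_d * phi_p

def spinodal_temperature(phi_d):
    """Spinodal: d2(dG_mix/RT)/dphi_d2 = 0, i.e.
    1/phi_d + 1/(M (1 - phi_d)) - 2 chi = 0, solved for T via chi(T) = A + B/T."""
    chi_s = 0.5 * (1.0 / phi_d + 1.0 / (M * (1.0 - phi_d)))
    return B / (chi_s - A)  # valid where chi_s > A

chi_25 = A + B / 298.15
print(f"chi(25 C) ~ {chi_25:.2f}, dG_mix/RT at phi_d=0.3: {dg_mix(0.3, chi_25):.3f}")
for p in np.linspace(0.05, 0.95, 10):
    print(f"phi_drug = {p:.2f}  T_spinodal ~ {spinodal_temperature(p):7.1f} K")
```

Compositions whose storage temperature falls below the spinodal curve are unstable to spontaneous phase separation, which is why the heating-rate and annealing effects the study probes matter for where these boundaries are drawn.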