924 results for TSALLIS ENTROPY
Abstract:
Background and aims: Evaluating status in patients with motor fluctuations is complex, and occasional observations or measurements do not give an adequate picture of the time spent in different states. We developed a test battery to assess advanced Parkinson patients' status, consisting of diary assessments and motor tests. This battery was constructed and implemented on a handheld computer with built-in mobile communication. In fluctuating patients, it should typically be used several times daily in the home environment, over periods of about one week. The aim of this battery is to provide status information in order to evaluate treatment effects in clinical practice and research, follow up treatments and disease progression, and predict outcomes to optimize treatment strategy. Methods: Selection of diary questions was based on a previous study with Duodopa® (DIREQT). Tapping tests (with and without visual cueing) and a spiral drawing test were added. Rapid prototyping was used in the development of the user interface. An evaluation with two pilot patients was performed before and after they received new treatments for advanced disease (one received Duodopa® and one received DBS). Tapping speed and the proportion of missed taps were calculated for the tapping tests, and the entropy of the radial drawing velocity was calculated for the spiral tests. Test variables were evaluated using non-parametric statistics. Results: Post-treatment improvement was detected in both patients in many of the test variables. Conclusions: Although validation work remains, preliminary results are promising and the test battery is currently being evaluated in a long-term health economics study with Duodopa® (DAPHNE).
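The abstract does not specify how the entropy of the radial drawing velocity is computed. A minimal Python sketch of one plausible computation, assuming arrays of stylus coordinates x, y and timestamps t (in seconds) and a histogram-based Shannon entropy, is shown below; it is an illustrative reconstruction, not the authors' exact algorithm.

import numpy as np

def radial_velocity_entropy(x, y, t, bins=16):
    """Shannon entropy (bits) of the radial drawing velocity of a spiral trace.

    Illustrative sketch only: the published feature may use a different
    estimator, centring, or binning.
    """
    r = np.hypot(x - x.mean(), y - y.mean())      # radius from spiral centre
    v = np.diff(r) / np.diff(t)                   # radial velocity samples
    hist, _ = np.histogram(v, bins=bins)
    p = hist / hist.sum()                         # empirical probabilities
    p = p[p > 0]
    return -np.sum(p * np.log2(p))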
Abstract:
Objective: To investigate whether spirography-based objective measures can effectively characterize the severity of unwanted symptom states (Off and dyskinesia) and discriminate them from the motor state of healthy elderly subjects. Background: Sixty-five patients with advanced Parkinson's disease (PD) and 10 healthy elderly (HE) subjects performed repeated spirography assessments using a touch-screen telemetry device in their home environments. On inclusion, the patients were either treated with levodopa-carbidopa intestinal gel or were candidates for switching to this treatment. On each test occasion, the subjects were asked to trace a pre-drawn Archimedes spiral shown on the screen using an ergonomic pen stylus. The test was repeated three times and was performed with the dominant hand. A clinician used a web interface that animated the spiral drawings, allowing him to observe different kinematic features, such as accelerations and spatial changes, during the drawing process and to rate different motor impairments. The motor impairments of drawing speed, irregularity, and hesitation were first rated on a 0 (normal) to 4 (extremely severe) scale, followed by assignment of the patient's momentary motor state to one of 2 categories, Off or Dyskinesia. A sample of spirals drawn by HE subjects was randomly selected and used in the subsequent analysis. Methods: The raw spiral data, consisting of stylus position and timestamps, were processed using time-series analysis techniques such as the discrete wavelet transform, approximate entropy, and dynamic time warping in order to extract 13 quantitative measures representing meaningful motor-impairment information. A principal component analysis (PCA) was used to reduce the dimensions of the quantitative measures to 4 principal components (PCs). In order to classify the motor states into 3 categories, that is, Off, HE, and dyskinesia, a logistic regression model was used as a classifier to map the 4 PCs to the corresponding clinically assigned motor-state categories. A stratified 10-fold cross-validation (also known as rotation estimation) was applied to assess the generalization ability of the logistic regression classifier to future independent data sets. To investigate mean differences of the 4 PCs across the three categories, a one-way ANOVA followed by Tukey multiple comparisons was used. Results: The agreement between computed and clinician ratings was very good, with a weighted area under the receiver operating characteristic curve (AUC) of 0.91. The mean PC scores differed across the three motor-state categories, although to different degrees. The first 2 PCs were good at discriminating between the motor states, whereas PC3 was good at discriminating between HE subjects and PD patients. The mean scores of PC4 showed a trend across the three states, but without significant differences. The Spearman's rank correlations between the first 2 PCs and the clinically assessed motor impairments were as follows: drawing speed (PC1, 0.34; PC2, 0.83), irregularity (PC1, 0.17; PC2, 0.17), and hesitation (PC1, 0.27; PC2, 0.77). Conclusions: These findings suggest that spirography-based objective measures are valid measures of spatial and time-dependent deficits and can be used to distinguish the drug-related motor dysfunctions Off and dyskinesia in PD.
These measures can potentially be useful in the clinical evaluation of individualized drug-related complications such as over- and under-medication, thus maximizing the amount of time patients spend in the On state.
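The PCA + logistic regression + stratified 10-fold cross-validation pipeline described above can be sketched in a few lines of Python with scikit-learn. The sketch below assumes X is the matrix of the 13 already-extracted spiral measures and y the clinician-assigned labels; the feature extraction itself (wavelets, approximate entropy, dynamic time warping) and the exact model settings are not reproduced.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict

def classify_motor_states(X, y):
    """X: n_tests x 13 spiral measures; y: labels ('Off', 'HE', 'Dyskinesia')."""
    model = make_pipeline(
        StandardScaler(),
        PCA(n_components=4),                 # reduce 13 measures to 4 PCs
        LogisticRegression(max_iter=1000),   # multinomial classifier
    )
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    return cross_val_predict(model, X, y, cv=cv)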
Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson's disease (PD) patients, using digital spiral data gathered by a touch-screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects, including 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early-stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper-limb motor tasks by tracing a pre-drawn Archimedes spiral shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion and the subjects were instructed to complete it within 10 seconds. Digital spiral data, including stylus position (x-y coordinates) and timestamps (milliseconds), were collected and used in the subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during the spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in a time series. APEN requires the determination of two parameters, namely the window size and the similarity measure. In our work, after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by the total drawing completion time and used in the subsequent analysis; this score is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores were used in the subsequent analysis. The first method was based on the Discrete Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment; this score is henceforth denoted WAV. The second method was based on the standard deviation of the frequency-filtered drawing velocity; this score is henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman's rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores differed between HE subjects and the three patient groups to varying degrees (P=0.626 for the S group with a 9.9% mean value difference, P=0.089 for the I group with 30.2%, and P=0.0019 for the A group with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was not correlated with either WAV or SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83), and SDDV (0.55).
CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at an advanced stage. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD. APEN was not correlated with either of the other two methods, indicating that it measures a different construct of upper-limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
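Approximate entropy with the parameters quoted above (window size m=4, similarity tolerance 0.2 times the standard deviation) can be computed as in the Python sketch below; it is a standard ApEn implementation, not the study's exact preprocessing pipeline.

import numpy as np

def approximate_entropy(u, m=4, r_frac=0.2):
    """Approximate entropy of a 1-D series u (e.g. drawing velocity samples)."""
    u = np.asarray(u, dtype=float)
    N = len(u)
    r = r_frac * np.std(u)                      # similarity tolerance

    def phi(m):
        # all overlapping windows of length m
        windows = np.array([u[i:i + m] for i in range(N - m + 1)])
        # Chebyshev distance between every pair of windows
        dist = np.max(np.abs(windows[:, None, :] - windows[None, :, :]), axis=2)
        C = np.mean(dist <= r, axis=1)          # fraction of similar windows
        return np.mean(np.log(C))

    return phi(m) - phi(m + 1)

The final APEN score described in the abstract would then be this value divided by the total drawing completion time, e.g. approximate_entropy(v) / (t[-1] - t[0]).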
Abstract:
Consideration of a wide range of plausible crime scenarios during any crime investigation is important to seek convincing evidence and hence to minimize the likelihood of miscarriages of justice. It is equally important for crime investigators to be able to employ effective and efficient evidence-collection strategies that are likely to produce the most conclusive information under limited available resources. An intelligent decision support system that can assist human investigators by automatically constructing plausible scenarios, and reasoning with the likely best investigating actions will clearly be very helpful in addressing these challenging problems. This paper presents a system for creating scenario spaces from given evidence, based on an integrated application of techniques for compositional modelling and Bayesian network-based evidence evaluation. Methods of analysis are also provided by the use of entropy to exploit the synthesized scenario spaces in order to prioritize investigating actions and hypotheses. These theoretical developments are illustrated by realistic examples of serious crime investigation.
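The paper's own prioritization scheme is not detailed in this abstract. As an illustration of how entropy can rank evidence-collection actions, the hypothetical Python sketch below scores each candidate action by its expected reduction in Shannon entropy over the hypothesis space; in the paper, the outcome probabilities and posteriors would come from the Bayesian-network evaluation of the synthesized scenario space.

import math

def entropy(probs):
    """Shannon entropy (bits) of a discrete distribution over hypotheses."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def expected_entropy_after(action_outcomes):
    """action_outcomes: list of (outcome probability, posterior over hypotheses)."""
    return sum(p_o * entropy(post) for p_o, post in action_outcomes)

def best_action(prior, candidate_actions):
    """Pick the action with the largest expected entropy reduction (information gain)."""
    h0 = entropy(prior)
    return max(candidate_actions,
               key=lambda a: h0 - expected_entropy_after(candidate_actions[a]))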
Abstract:
The paper investigates which of Shannon’s measures (entropy, conditional entropy, mutual information) is the right one for the task of quantifying information flow in a programming language. We examine earlier relevant contributions from Denning, McLean and Gray and we propose and motivate a specific quantitative definition of information flow. We prove results relating equivalence relations, interference of program variables, independence of random variables and the flow of confidential information. Finally, we show how, in our setting, Shannon’s Perfect Secrecy theorem provides a sufficient condition to determine whether a program leaks confidential information.
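For reference, the three Shannon measures compared in the paper are, for discrete random variables X and Y:

\[
H(X) = -\sum_{x} p(x)\log_2 p(x), \qquad
H(X \mid Y) = -\sum_{x,y} p(x,y)\log_2 p(x \mid y), \qquad
I(X;Y) = H(X) - H(X \mid Y).
\]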
Abstract:
The Brazilian State and society were out of step during the 1980s. The immediate consequence of this phenomenon was the insufficient fulfilment of society's basic needs in that period, with an increase in entropy in several Brazilian social subsystems, among them the health subsystem. In this thesis, working with economic, social, and health data, and constructing a set of indicator variables, society's needs were confronted with the State's actions in the health area over that period. Using statistical techniques - graphical analysis, statistical association of the selected indicators (Pearson correlation matrix), principal component analysis, cluster analysis, and multiple linear regression with log-transformed variables - it was possible to visualize causes and consequences of this high entropy, characterized by waste of resources and several situations prone to generating crises in the organizations, sectors, and institutions of the Brazilian health subsystem. A method for allocating federal resources is proposed, aiming to minimize inequalities among the Units of the Federation based on their performance in the health area.
Abstract:
In this dissertation we discuss, based on Bachelard's epistemology, the more general problems that currently surround the teaching of the physical sciences (chemistry and physics) at the secondary-school level. Our goal was to examine the role played by the various philosophical currents that, from modernity onwards, broadly influenced scientific culture, highlighting the most significant consequences for the teaching of chemistry and physics. To better understand these effects, we also carried out a field survey with the chemistry and physics teachers who work in the municipality of Rio de Janeiro. From this it was possible to show the fragile situation of secondary-level teaching of these sciences and to propose alternatives for changing such a negative picture.
Abstract:
The increase in the complexity of the financial market has been reported by Rajan (2005), Gorton (2008), and Haldane and May (2011) as one of the main factors responsible for the rise in systemic risk that culminated in the 2007/08 financial crisis. The Bank for International Settlements (2013) addresses the issue of complexity in the context of banking regulation and discusses the comparability of capital adequacy across banks and across jurisdictions. However, definitions of the concepts of complexity and of complex adaptive systems are absent from the main discussions. This article clarifies some concepts related to Complexity theories, how this phenomenon emerges, and how the concepts can be applied to the financial market. Two tools that can be used in the context of complex adaptive systems are discussed, Agent-Based Models (ABMs) and entropy, and they are compared with traditional tools. We conclude that, although the complexity research programme leaves gaps, it certainly contributes to the economic research agenda for understanding the mechanisms that trigger systemic risks, and it adds tools for modelling heterogeneous interacting agents, allowing emergent phenomena to arise in the system. Research hypotheses are suggested for further development.
Abstract:
This dissertation presents two papers on how simple systemic risk measures can be used to assess portfolio risk characteristics. The first paper deals with the Granger-causation of systemic risk indicators based on correlation matrices of stock returns. Special focus is devoted to the Eigenvalue Entropy, since some previous literature indicated strong results, although without considering different macroeconomic scenarios; the Index Cohesion Force and the Absorption Ratio are also considered. Considering the S&P500, there is no evidence of Granger-causation from the Eigenvalue Entropies or the Index Cohesion Force. The Absorption Ratio Granger-caused both the S&P500 and the VIX index, being the only simple measure that passed this test. The second paper develops this measure to capture the regimes underlying the American stock market. New indicators are built using filtering and random matrix theory. The returns of the S&P500 are modelled as a mixture of normal distributions. The activation of each normal distribution is governed by a Markov chain, with the transition probabilities being a function of the indicators. The model shows that a Herfindahl-Hirschman Index of the normalized eigenvalues exhibits the best fit to returns from 1998 to 2013.
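Two of the spectral indicators mentioned above, the eigenvalue entropy and the Herfindahl-Hirschman Index of the normalized eigenvalues, can be computed from a panel of returns as in the Python sketch below. The dissertation's exact estimation windows, filtering, and random-matrix cleaning are not reproduced.

import numpy as np

def eigen_indicators(returns):
    """Eigenvalue entropy and HHI of the normalized eigenvalues of a return panel.

    returns : T x N array of stock returns (e.g. S&P500 constituents).
    """
    corr = np.corrcoef(returns, rowvar=False)       # N x N correlation matrix
    eigvals = np.linalg.eigvalsh(corr)
    p = eigvals / eigvals.sum()                     # normalized eigenvalues
    entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))  # eigenvalue entropy
    hhi = np.sum(p ** 2)                            # concentration of the spectrum
    return entropy, hhi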
Abstract:
We aim to provide a review of the stochastic discount factor bounds usually applied to diagnose asset pricing models. In particular, we mainly discuss the bounds used to analyze the disaster model of Barro (2006). Our attention is focused on this disaster model since the stochastic discount factor bounds applied to study the performance of disaster models usually consider the approach of Barro (2006). We first present the entropy bounds that provide a diagnosis of the analyzed disaster model, namely the methods of Almeida and Garcia (2012, 2016) and Ghosh et al. (2016). Then we discuss how their results for the disaster model relate to each other, and we also present the findings of other methodologies that are similar to these bounds but provide different evidence about the performance of the framework developed by Barro (2006).
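The specific bounds of Almeida and Garcia and Ghosh et al. are not reproduced in this abstract. For orientation only, one common entropy measure for a stochastic discount factor m underlying bounds of this kind, together with a standard bound of the Backus-Chernov-Zin type (R any traded gross return, R^f the risk-free return), is

\[
L(m) = \log \mathbb{E}[m] - \mathbb{E}[\log m], \qquad
L(m) \ \ge\ \mathbb{E}\!\left[\log R - \log R^{f}\right].
\]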
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The rational construction necessary to systematize scientific knowledge in physics introduces difficulties in understanding some of its concepts. One concept that properly exemplifies this difficulty in learning or teaching is entropy. This thesis proposes the construction of a didactic route that constitutes a historical and epistemological path to entropy, intended to contribute to the teaching of this concept as well as of other physics concepts. The basic assumption behind this route is that, through a historical review of the development of this concept along the lines suggested by the epistemology of Bachelard (1884-1962), it is possible to make the subjects to be taught and learned more meaningful. Initially I composed a brief biographical note to give the reader an idea of the issues, interests, and reflections related to science, of how I dealt with them in my private and professional life, and of the role they played in leading me to write this thesis. The strategy for constructing the route to entropy was to split the usual contents of basic thermodynamics into three moments so that they constitute epistemological units, each identified by the way of thinking of the corresponding moment of scientific knowledge production: a technical and empiricist moment, a rationalist and positivist moment, and a post-positivist rationalist one. The transition between moments is characterized by a rupture with the former way of thinking; nevertheless, the progress in the construction of knowledge in the area is evident. In the final part of this work I present an analysis based on the elements of Bachelard's epistemology present in each moment. This analysis is the basic component of the didactic route that I set out to build. The way this route to entropy was built could contribute to the construction of other didactic routes in physics and in other sciences, as a way of unveiling hidden meanings and as a tool to humanize scientific knowledge.
Abstract:
Currently, one of the biggest challenges in data mining is to perform cluster analysis on complex data. Several techniques have been proposed, but, in general, they only achieve good results within specific domains, providing no consensus on the best way to group this kind of data. In general, these techniques fail due to unrealistic assumptions about the true probability distribution of the data. Based on this, this thesis proposes a new measure based on the Cross Information Potential that uses representative points of the dataset and statistics extracted directly from the data to measure the interaction between groups. The proposed approach allows us to use all the advantages of this information-theoretic descriptor and overcomes the limitations imposed on it by its own nature. From this, two cost functions and three algorithms were proposed to perform cluster analysis. Because the use of Information Theory captures the relationship between different patterns regardless of assumptions about the nature of this relationship, the proposed approach achieved better performance than the main algorithms in the literature. These results apply both to synthetic data designed to test the algorithms in specific situations and to real data extracted from problems in different fields.
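The thesis's representative-point variant is not detailed in the abstract. As background, the standard Cross Information Potential between two groups of points, estimated with a Gaussian kernel as in information-theoretic learning, can be sketched in Python as follows.

import numpy as np

def cross_information_potential(X, Y, sigma=1.0):
    """Cross Information Potential between two groups of points.

    X : (n, d) array, Y : (m, d) array, sigma : Gaussian kernel width.
    Standard definition only; the thesis's modified measure is not reproduced.
    """
    diff = X[:, None, :] - Y[None, :, :]                 # pairwise differences
    sq = np.sum(diff ** 2, axis=2)                       # squared distances
    d = X.shape[1]
    norm = (2 * np.pi * sigma ** 2) ** (d / 2)
    kernel = np.exp(-sq / (2 * sigma ** 2)) / norm       # Gaussian kernel values
    return kernel.mean()                                 # average over all pairs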
Abstract:
In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained using traditional vector quantization techniques. Several approaches developed during the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The new method requires only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link the auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help in the choice of the correct threshold. Several analyses are carried out and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed and a new one is proposed, based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed for statistically modelling the classes, aiming to obtain an expression for the probability that a point belongs to each class. Experiments with several values of Na and dt are carried out on test sets, and the results are analyzed in order to study the robustness of the method and to devise heuristics for choosing the correct threshold. The work also explores aspects of information theory applied to the calculation of the divergences, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also includes an appendix presenting real applications of the proposed method.
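The abstract does not give the estimators used for the Rényi-entropy-based divergences between auxiliary clusters. As an illustration of the kind of quantity involved, the Python sketch below gives the usual Parzen-window estimate of the Rényi quadratic entropy of a cluster; the thesis's specific divergence and negentropy measures are not reproduced.

import numpy as np

def renyi_quadratic_entropy(X, sigma=1.0):
    """Parzen-window estimate of the Renyi quadratic entropy H2 of a cluster.

    X : (n, d) array of points; sigma : kernel width.
    """
    n, d = X.shape
    diff = X[:, None, :] - X[None, :, :]
    sq = np.sum(diff ** 2, axis=2)
    # Gaussian kernel with effective width sigma*sqrt(2) from the kernel convolution
    kernel = np.exp(-sq / (4 * sigma ** 2)) / ((4 * np.pi * sigma ** 2) ** (d / 2))
    information_potential = kernel.mean()
    return -np.log(information_potential)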
Abstract:
Discussions about pollution caused by vehicle emissions are long-standing and have developed over the years. The search for cleaner technologies and frequent climate changes have been leading industries and government organizations to impose much more rigorous limits on the contaminant content of fuels, which has a direct impact on atmospheric emissions. Nowadays, fuel quality control with respect to sulfur content is carried out through the hydrodesulfurization process. Adsorption processes also represent an interesting alternative route for sulfur removal, being simpler and operating at ambient temperature and pressure. This work studies the synthesis and characterization of aluminophosphates impregnated with zinc, molybdenum, or both, and their application to sulfur removal from gasoline by adsorption, using a model gasoline containing isooctane and thiophene. The adsorbents were characterized by X-ray diffraction, thermal analysis (DTG), X-ray fluorescence, and scanning electron microscopy (SEM). The specific surface area, pore volume, and pore diameter were determined by the BET (Brunauer-Emmett-Teller) and t-plot methods. The sulfur content was quantified by elementary analysis using an ANTEK 9000 NS. The adsorption process was evaluated as a function of temperature and initial sulfur content through the adsorption isotherms and their thermodynamic parameters. The entropy (ΔS), enthalpy (ΔH), and Gibbs free energy (ΔG) changes were calculated from the plot of ln(Kd) versus 1/T. The Langmuir, Freundlich, and Langmuir-Freundlich models were fitted to the experimental data, and the last one gave the best results. The thermodynamic tests were carried out at different temperatures (30, 40, and 50 °C), from which it was concluded that the adsorption process is spontaneous and exothermic. The adsorption kinetics were studied over 24 h and showed that the adsorption capacities of the studied adsorbents follow the order MoZnPO > MoPO > ZnPO > AlPO. The maximum adsorption capacity was 4.91 mg/g for MoZnPO, with an adsorption efficiency of 49%.
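The thermodynamic parameters quoted above follow from the standard van't Hoff treatment: plotting ln(Kd) against 1/T gives a straight line whose slope and intercept yield ΔH and ΔS, and ΔG follows from the Gibbs relation (R is the gas constant; the relations assume Kd is treated as a dimensionless equilibrium constant):

\[
\ln K_d = \frac{\Delta S}{R} - \frac{\Delta H}{R\,T}, \qquad
\Delta G = \Delta H - T\,\Delta S = -R\,T \ln K_d .
\]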