14 results for Divergence time estimates
in Aston University Research Archive
Abstract:
Shropshire Energy Team initiated this study to examine energy consumption and associated emissions in the predominantly rural county of Shropshire. Current use of energy is not sustainable in the long term, and there are various approaches to dealing with the environmental problems it creates. Energy planning by a local authority for a sustainable future requires detailed energy consumption and environmental information. This information would enable target setting and the implementation of policies designed to encourage energy efficiency improvements and the exploitation of renewable energy resources. This could aid regeneration strategies by providing new employment opportunities. Associated reductions in carbon dioxide and other emissions would help to meet national and international environmental targets. In the absence of this detailed information, the objective was to develop a methodology for assessing energy consumption and emissions on a regional basis from 1990 onwards for all local planning authorities. This would enable a more accurate assessment of the relevant issues, so that plans are more appropriate and longer lasting. A first comprehensive set of data was gathered from a wide range of sources, and a strong correlation was found between population and energy consumption for a variety of regions across the UK. The methodology was applied to the county of Shropshire to give, for the first time, estimates of primary fuel consumption, electricity consumption and associated emissions in Shropshire for 1990 to 2025. The estimates provide a suitable baseline for assessing the contribution renewable energy could make to meeting electricity demand in the county and to reducing emissions. The assessment indicated that total primary fuel consumption was 63,518,018 GJ/y in 1990, increasing to 119,956,465 GJ/y by 2025. This is associated with emissions of 1,129,626 t/y of carbon in 1990, rising to 1,303,282 t/y by 2025. In 1990, 22,565,713 GJ/y of the primary fuel consumption was used for generating electricity, rising to 23,478,050 GJ/y in 2025. If targets to reduce primary fuel consumption are reached, emissions of carbon would fall to 1,042,626 t/y by 2025; if renewable energy targets were also reached, emissions of carbon would fall to 988,638 t/y by 2025.
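The population-based scaling at the core of the methodology can be illustrated with a short sketch. The linear form and all inputs here are placeholders for illustration, not values or methods taken from the study.

```python
# Illustrative only: fit and apply a population-consumption relationship of
# the kind described above. Inputs are placeholder arrays, not study data.
import numpy as np

def fit_population_model(populations, consumptions_gj_per_year):
    """Least-squares line relating regional population to energy use (GJ/y)."""
    slope, intercept = np.polyfit(populations, consumptions_gj_per_year, deg=1)
    return slope, intercept

def estimate_consumption(population, slope, intercept):
    """Predict annual primary fuel consumption (GJ/y) for a given population."""
    return slope * population + intercept
```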
Abstract:
In this paper we propose a quantum algorithm to measure the similarity between a pair of unattributed graphs. We design an experiment where the two graphs are merged by establishing a complete set of connections between their nodes and the resulting structure is probed through the evolution of continuous-time quantum walks. In order to analyze the behavior of the walks without causing wave function collapse, we base our analysis on the recently introduced quantum Jensen-Shannon divergence. In particular, we show that the divergence between the evolution of two suitably initialized quantum walks over this structure is maximum when the original pair of graphs is isomorphic. We also prove that under special conditions the divergence is minimum when the sets of eigenvalues of the Hamiltonians associated with the two original graphs have an empty intersection.
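A minimal sketch of the key computation, assuming the adjacency matrix of the merged graph as Hamiltonian and a time-averaged density matrix for each walk; the initial states, time grid and function names are illustrative choices, not the authors' implementation.

```python
# Hedged sketch: quantum Jensen-Shannon divergence between the evolutions of
# two continuous-time quantum walks. The Hamiltonian choice, initial states
# and the time-averaging are assumptions made for illustration.
import numpy as np
from scipy.linalg import expm

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def ctqw_density_matrix(H, psi0, times):
    """Time-averaged density matrix of a continuous-time quantum walk."""
    rho = np.zeros(H.shape, dtype=complex)
    for t in times:
        psi_t = expm(-1j * H * t) @ psi0
        rho += np.outer(psi_t, psi_t.conj())
    return rho / len(times)

def quantum_js_divergence(rho_a, rho_b):
    """QJSD = S((rho_a + rho_b)/2) - (S(rho_a) + S(rho_b))/2."""
    mix = 0.5 * (rho_a + rho_b)
    return von_neumann_entropy(mix) - 0.5 * (
        von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b))
```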
Abstract:
Computer-simulated trajectories of bulk water molecules form complex spatiotemporal structures at the picosecond time scale. This intrinsic complexity, which underlies the formation of molecular structures at longer time scales, has been quantified using a measure of statistical complexity. The method estimates the information contained in the molecular trajectory by detecting and quantifying temporal patterns present in the simulated data (velocity time series). Two types of temporal patterns are found. The first, defined by the short-time correlations corresponding to the velocity autocorrelation decay times (≈0.1 ps), remains asymptotically stable for time intervals longer than several tens of nanoseconds. The second is caused by previously unknown longer-time correlations (found at longer than nanosecond time scales) leading to a value of statistical complexity that slowly increases with time. A direct measure is introduced, based on the notion of statistical complexity, that describes how the trajectory explores the phase space and is independent of the particular molecular signal used as the observed time series. © 2008 The American Physical Society.
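As a rough illustration of quantifying temporal patterns in a scalar signal, the sketch below computes a generic statistical-complexity value (normalized Shannon entropy times disequilibrium over ordinal patterns of the velocity series). This is a standard estimator chosen for illustration; it is not necessarily the measure used in the paper.

```python
# Illustrative sketch only: an LMC-style statistical complexity of a scalar
# velocity time series, using ordinal patterns. The pattern length and the
# estimator itself are assumptions, not the paper's exact method.
import numpy as np
from collections import Counter
from math import factorial

def statistical_complexity(x, order=4):
    """Normalized pattern entropy multiplied by disequilibrium from uniform."""
    n_patterns = factorial(order)
    counts = Counter(tuple(np.argsort(x[i:i + order]))
                     for i in range(len(x) - order + 1))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    p = np.concatenate([p, np.zeros(n_patterns - len(p))])  # unseen patterns
    uniform = np.full(n_patterns, 1.0 / n_patterns)
    entropy = -np.sum(p[p > 0] * np.log(p[p > 0])) / np.log(n_patterns)
    disequilibrium = np.sum((p - uniform) ** 2)
    return entropy * disequilibrium
```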
Abstract:
When making predictions with complex simulators it can be important to quantify the various sources of uncertainty. Errors in the structural specification of the simulator, for example due to missing processes or incorrect mathematical specification, can be a major source of uncertainty, but are often ignored. We introduce a methodology for inferring the discrepancy between the simulator and the system in discrete-time dynamical simulators. We assume a structural form for the discrepancy function, and show how to infer the maximum-likelihood parameter estimates using a particle filter embedded within a Monte Carlo expectation maximization (MCEM) algorithm. We illustrate the method on a conceptual rainfall-runoff simulator (logSPM) used to model the Abercrombie catchment in Australia. We assess the simulator and discrepancy model on the basis of their predictive performance using proper scoring rules. This article has supplementary material online. © 2011 International Biometric Society.
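A minimal bootstrap particle filter sketch, assuming a one-dimensional state, Gaussian observation noise and placeholder `simulate` and `delta` callables; inside an MCEM loop this log-likelihood (up to an additive constant) would be maximized over the discrepancy parameters `theta`. None of this is the paper's logSPM implementation.

```python
# Hedged sketch: bootstrap particle filter returning an estimate of the
# log-likelihood (up to an additive constant) of discrepancy parameters
# theta. `simulate` and `delta` are placeholder callables, not logSPM.
import numpy as np

def particle_log_likelihood(y, simulate, delta, theta, x0,
                            n_particles=500, obs_sd=1.0, seed=0):
    rng = np.random.default_rng(seed)
    particles = np.full(n_particles, x0, dtype=float)
    log_lik = 0.0
    for y_t in y:
        # propagate: simulator step plus parametrized discrepancy and noise
        particles = (simulate(particles) + delta(particles, theta)
                     + rng.normal(0.0, theta["proc_sd"], n_particles))
        # weight particles by the (unnormalized) Gaussian observation density
        w = np.exp(-0.5 * ((y_t - particles) / obs_sd) ** 2)
        log_lik += np.log(w.mean() + 1e-300)
        # multinomial resampling proportional to the weights
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        particles = particles[idx]
    return log_lik
```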
Abstract:
In this thesis the results of experimental work performed to determine local heat transfer coefficients for non-Newtonian fluids in laminar flow through pipes with abrupt discontinuities are reported. The fluids investigated were water-based polymeric solutions of time-independent, pseudoplastic materials, with flow indices "n" ranging from 0.39 to 0.9. The tube configurations were a 3.3:1 sudden convergence and a 1:3.3 sudden divergence. The condition of a prescribed uniform wall heat flux was considered, with both upstream and downstream tube sections heated. Radial temperature traverses were also undertaken, primarily to justify the procedures used in estimating the tube wall and bulk fluid temperatures and secondly to give further insight into the mechanism of heat transfer beyond a sudden tube expansion. A theoretical assessment of the influence of viscous dissipation on a non-Newtonian pseudoplastic fluid of arbitrary index "n" was carried out. The effects of other secondary factors such as free convection and temperature-dependent consistency were evaluated empirically. In the present investigations, the test conditions were chosen to minimise the effects of natural convection, and the estimates of viscous heat generation showed the effect to be insignificant at the polymeric concentrations tested. The final results are presented as relationships between local heat transfer coefficient and axial distance downstream of the discontinuities, and between dimensionless wall temperature and reduced radius. The influence of Reynolds number, Prandtl number, non-Newtonian index and heat flux has been indicated.
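For reference, the power-law (Ostwald-de Waele) model for a time-independent pseudoplastic fluid, and one natural generalization of the Brinkman number used to gauge whether viscous dissipation matters under a constant wall heat flux, can be written in generic notation (not necessarily the symbols used in the thesis):

```latex
\tau = K\,\dot{\gamma}^{\,n} \quad (n < 1 \ \text{for pseudoplastic fluids}),
\qquad
\mathrm{Br} = \frac{K\,\bar{u}^{\,n+1}}{q_w\,D^{\,n}},
```

where K is the consistency index, n the flow index, ū the mean velocity, D the tube diameter and q_w the wall heat flux; a small Brinkman number indicates negligible viscous heating.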
Abstract:
Congenital nystagmus (CN) is an ocular-motor disorder characterised by involuntary, conjugated ocular oscillations, and its pathogenesis is still under investigation. This kind of nystagmus is termed congenital (or infantile) since it can be present at birth or arise in the first months of life. Most CN patients show a considerable decrease in visual acuity: image fixation on the retina is disturbed by the continuous, mainly horizontal, nystagmus oscillations. However, the image of a given target can still be stable during short periods in which eye velocity slows down while the target image is placed onto the fovea (called foveation intervals). To quantify the extent of nystagmus, eye movement recordings are routinely employed, allowing physicians to extract and analyse the main features of the nystagmus, such as waveform shape, amplitude and frequency. Using eye movement recordings, it is also possible to compute estimated visual acuity predictors: analytical functions which estimate expected visual acuity from signal features such as foveation time and foveation position variability. Use of these functions extends the information obtained from typical visual acuity measurements (e.g. the Landolt C test) and could support therapy planning or monitoring. This study focuses on detecting the waveform type of CN patients and on measuring foveation time. Specifically, it proposes a robust method to recognize cycles corresponding to the specific CN waveform in the eye movement pattern and, for those cycles, to evaluate the exact signal tracts in which a subject foveates. About 40 eye-movement recordings, either infrared-oculographic or electro-oculographic, were acquired from 16 CN subjects. Results suggest that the use of an adaptive threshold applied to the eye velocity signal could improve the estimation of the slow-phase start point. This can enhance foveation time computation and reduce the influence of repositioning saccades and data noise on waveform type identification.
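A hedged sketch of the kind of adaptive velocity thresholding described above; the median-based rule, the constants and the function names are assumptions made for illustration, not the method actually proposed in the study.

```python
# Illustrative sketch: adaptive velocity threshold for locating low-velocity
# (candidate foveation) samples in a nystagmus position recording.
import numpy as np

def foveation_mask(position, fs, k=0.5, min_duration_ms=30):
    """Boolean mask of samples whose eye velocity stays below an adaptive
    threshold for at least `min_duration_ms` milliseconds."""
    velocity = np.gradient(position) * fs          # deg/s, from position in deg
    threshold = k * np.median(np.abs(velocity))    # adapts to each recording
    slow = np.abs(velocity) < threshold
    min_len = int(min_duration_ms * fs / 1000)
    mask = np.zeros_like(slow)
    start = None
    # keep only runs of consecutive slow samples that last long enough
    for i, s in enumerate(np.append(slow, False)):
        if s and start is None:
            start = i
        elif not s and start is not None:
            if i - start >= min_len:
                mask[start:i] = True
            start = None
    return mask
```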
Foveation time measure in Congenital Nystagmus through second order approximation of the slow phases
Abstract:
Congenital Nystagmus (CN) is an ocular-motor disorder characterised by involuntary, conjugated ocular oscillations, and its pathogenesis is still unknown. The pathology is defined as "congenital" from its time of onset, which can be at birth or in the first months of life. Visual acuity in CN subjects is often diminished due to the continuous nystagmus oscillations, mainly on the horizontal plane, which disturb image fixation on the retina. However, during short periods in which eye velocity slows down while the target image is placed onto the fovea (called foveation intervals), the image of a given target can still be stable, allowing a subject to reach a higher visual acuity. In CN subjects, visual acuity is usually assessed both with typical measurement techniques (e.g. the Landolt C test) and with eye movement recordings in different gaze positions. The offline study of eye movement recordings allows physicians to analyse the main features of the nystagmus, such as waveform shape, amplitude and frequency, and to compute estimated visual acuity predictors. These analytical functions estimate the best corrected visual acuity using foveation time and foveation position variability, hence a reliable estimation of these two parameters is fundamental in assessing visual acuity. This work aims to enhance foveation time estimation in CN eye movement recordings by computing a second order approximation of the slow phase components of the nystagmus oscillations. About 19 infrared-oculographic eye-movement recordings from 10 CN subjects were acquired, and the visual acuity assessed with an acuity predictor was compared to the one measured in primary position. Results suggest that visual acuity measurements based on foveation time estimation obtained from interpolated data are closer to the values obtained during Landolt C tests. © 2010 IEEE.
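The sketch below illustrates the second-order approximation idea: fit each slow phase with a parabola and measure foveation time on the fitted trace. The ±4 deg/s velocity criterion, the segment boundaries and the function names are assumptions, not necessarily the paper's choices.

```python
# Illustrative sketch: parabolic (degree-2) least-squares fit of a slow phase
# and foveation time measured on the analytic velocity of the fitted trace.
import numpy as np

def fit_slow_phase(t, position):
    """Fit position(t) over one slow phase with a degree-2 polynomial."""
    coeffs = np.polyfit(t, position, deg=2)
    fitted = np.polyval(coeffs, t)
    velocity = np.polyval(np.polyder(coeffs), t)   # analytic derivative
    return fitted, velocity

def foveation_time(t, velocity, v_max=4.0):
    """Total time (same units as t) the fitted velocity stays below v_max."""
    dt = np.median(np.diff(t))
    return float(np.sum(np.abs(velocity) < v_max) * dt)
```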
Abstract:
In this paper, we develop a new graph kernel by using the quantum Jensen-Shannon divergence and the discrete-time quantum walk. To this end, we commence by performing a discrete-time quantum walk to compute a density matrix over each graph being compared. For a pair of graphs, we compare the mixed quantum states represented by their density matrices using the quantum Jensen-Shannon divergence. With the density matrices for a pair of graphs to hand, the quantum graph kernel between the pair of graphs is defined by exponentiating the negative quantum Jensen-Shannon divergence between the graph density matrices. We evaluate the performance of our kernel on several standard graph datasets, and demonstrate the effectiveness of the new kernel.
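In symbols (notation assumed here for compactness rather than taken from the paper), the kernel between graphs G_a and G_b with walk density matrices ρ_a and ρ_b is

```latex
k(G_a, G_b) = \exp\bigl(-D_{\mathrm{JS}}(\rho_a, \rho_b)\bigr),
\qquad
D_{\mathrm{JS}}(\rho_a, \rho_b) =
S\!\Bigl(\tfrac{\rho_a + \rho_b}{2}\Bigr)
- \tfrac{1}{2}\bigl(S(\rho_a) + S(\rho_b)\bigr),
```

where S denotes the von Neumann entropy.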
Abstract:
The study of complex networks has recently attracted increasing interest because of the large variety of systems that can be modeled using graphs. A fundamental operation in the analysis of complex networks is that of measuring the centrality of a vertex. In this paper, we propose to measure vertex centrality using a continuous-time quantum walk. More specifically, we relate the importance of a vertex to the influence that its initial phase has on the interference patterns that emerge during the quantum walk evolution. To this end, we make use of the quantum Jensen-Shannon divergence between two suitably defined quantum states. We investigate how the importance varies as we change the initial state of the walk and the Hamiltonian of the system. We find that, for a suitable combination of the two, the importance of a vertex is almost linearly correlated with its degree. Finally, we evaluate the proposed measure on two commonly used networks. © 2014 Springer-Verlag Berlin Heidelberg.
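One way to write down such a measure (a sketch under assumed notation; the paper's exact definition may differ) scores a vertex v by how much flipping the phase of its initial amplitude changes the resulting walk state:

```latex
C(v) = D_{\mathrm{JS}}\bigl(\rho(\psi_0),\; \rho(\psi_0^{(v)})\bigr),
\qquad
\psi_0^{(v)} = \psi_0 - 2\,\langle v \mid \psi_0 \rangle\, \lvert v \rangle,
```

where ρ(·) is the density operator obtained by evolving the walk from the given initial state under the chosen Hamiltonian.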
Abstract:
In this paper, we use the quantum Jensen-Shannon divergence as a means to establish the similarity between a pair of graphs and to develop a novel graph kernel. In quantum theory, the quantum Jensen-Shannon divergence is defined as a distance measure between quantum states. In order to compute the quantum Jensen-Shannon divergence between a pair of graphs, we first need to associate a density operator with each of them. Hence, we decide to simulate the evolution of a continuous-time quantum walk on each graph and we propose a way to associate a suitable quantum state with it. With the density operator of this quantum state to hand, the graph kernel is defined as a function of the quantum Jensen-Shannon divergence between the graph density operators. We evaluate the performance of our kernel on several standard graph datasets from bioinformatics. We use Principal Component Analysis (PCA) on the kernel matrix to embed the graphs into a feature space for classification. The experimental results demonstrate the effectiveness of the proposed approach. © 2013 Springer-Verlag.
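A short sketch of the embedding-and-classification step, assuming the pairwise kernel values have already been assembled into an n x n Gram matrix K; scikit-learn's precomputed-kernel interface is used purely for illustration, and the linear SVM is an arbitrary choice of classifier, not necessarily the one used in the paper.

```python
# Illustrative sketch: embed graphs from a precomputed kernel matrix K
# (n_graphs x n_graphs) with kernel PCA, then train a classifier.
from sklearn.decomposition import KernelPCA
from sklearn.svm import SVC

def embed_and_classify(K, labels, n_components=10):
    embedding = KernelPCA(n_components=n_components,
                          kernel="precomputed").fit_transform(K)
    clf = SVC(kernel="linear").fit(embedding, labels)
    return embedding, clf
```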
Abstract:
Kernel methods provide a way to apply a wide range of learning techniques to complex and structured data by shifting the representational problem from one of finding an embedding of the data to that of defining a positive semidefinite kernel. In this paper, we propose a novel kernel on unattributed graphs where the structure is characterized through the evolution of a continuous-time quantum walk. More precisely, given a pair of graphs, we create a derived structure whose degree of symmetry is maximum when the original graphs are isomorphic. With this new graph to hand, we compute the density operators of the quantum systems representing the evolutions of two suitably defined quantum walks. Finally, we define the kernel between the two original graphs as the quantum Jensen-Shannon divergence between these two density operators. The experimental evaluation shows the effectiveness of the proposed approach. © 2013 Springer-Verlag.
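Because the whole construction hinges on the kernel being positive semidefinite, a quick numerical sanity check on the resulting Gram matrix can be useful before handing it to a kernel machine (an illustrative helper, not part of the paper):

```python
# Illustrative check: a Gram matrix built from a valid kernel should have no
# significantly negative eigenvalues (up to numerical round-off).
import numpy as np

def is_positive_semidefinite(K, tol=1e-8):
    K = 0.5 * (K + K.T)                    # symmetrize against round-off
    return np.linalg.eigvalsh(K).min() >= -tol
```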
Abstract:
One of the most fundamental problems that we face in the graph domain is that of establishing the similarity, or alternatively the distance, between graphs. In this paper, we address the problem of measuring the similarity between attributed graphs. In particular, we propose a novel way to measure the similarity through the evolution of a continuous-time quantum walk. Given a pair of graphs, we create a derived structure whose degree of symmetry is maximum when the original graphs are isomorphic, and where a subset of the edges is labeled with the similarity between the respective nodes. With this compositional structure to hand, we compute the density operators of the quantum systems representing the evolution of two suitably defined quantum walks. We define the similarity between the two original graphs as the quantum Jensen-Shannon divergence between these two density operators, and then we show how to build a novel kernel on attributed graphs based on the proposed similarity measure. We perform an extensive experimental evaluation both on synthetic and real-world data, which shows the effectiveness of the proposed approach. © 2013 Springer-Verlag.
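A hedged sketch of the compositional structure described above: a block adjacency matrix in which every node of one graph is linked to every node of the other by an edge weighted with the similarity of their attributes. The Gaussian attribute similarity and the function name are assumptions made for illustration, not necessarily the measure used in the paper.

```python
# Illustrative sketch: merged adjacency matrix of two attributed graphs, with
# inter-graph edges weighted by a Gaussian similarity of node attributes.
import numpy as np

def merged_adjacency(A1, A2, attrs1, attrs2, sigma=1.0):
    """A1, A2: adjacency matrices; attrs1, attrs2: per-node attribute rows."""
    # pairwise squared distances between attributes of the two node sets
    d2 = ((attrs1[:, None, :] - attrs2[None, :, :]) ** 2).sum(axis=2)
    S = np.exp(-d2 / (2.0 * sigma ** 2))
    top = np.hstack([A1, S])
    bottom = np.hstack([S.T, A2])
    return np.vstack([top, bottom])
```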
Abstract:
The aim of this review was to quantify the global variation in childhood myopia prevalence over time, taking account of demographic and study design factors. A systematic review identified population-based surveys with estimates of childhood myopia prevalence published by February 2015. Multilevel binomial logistic regression of log odds of myopia was used to examine the association with age, gender, urban versus rural setting and survey year, among populations of different ethnic origins, adjusting for study design factors. 143 published articles (42 countries, 374 349 subjects aged 1-18 years, 74 847 myopia cases) were included. The increase in myopia prevalence with age varied by ethnicity. East Asians showed the highest prevalence, reaching 69% (95% credible intervals (CrI) 61% to 77%) at 15 years of age (86% among Singaporean-Chinese). Blacks in Africa had the lowest prevalence: 5.5% at 15 years (95% CrI 3% to 9%). Time trends in myopia prevalence over the last decade were small in whites and increased by 23% in East Asians, with a weaker increase among South Asians. Children from urban environments have 2.6 times the odds of myopia compared with those from rural environments. In whites and East Asians, sex differences emerge at about 9 years of age; by late adolescence girls are twice as likely as boys to be myopic. Marked ethnic differences in age-specific prevalence of myopia exist. Rapid increases in myopia prevalence over time, particularly in East Asians, combined with a universally higher risk of myopia in urban settings, suggest that environmental factors play an important role in myopia development, which may offer scope for prevention.
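As a simplified, non-hierarchical illustration of the modelling approach (the review itself fitted a multilevel Bayesian model adjusting for study design factors), aggregated prevalence data could be analysed with a binomial logistic regression; the column names below describe a hypothetical dataset, not the review's data.

```python
# Illustrative, non-hierarchical sketch only: binomial logistic regression of
# myopia counts on age, urban setting and survey year for aggregated strata.
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_prevalence_model(df):
    """df: pandas DataFrame with columns `cases`, `noncases`, `age`,
    `urban` (0/1) and `year`, one row per survey stratum."""
    model = smf.glm("cases + noncases ~ age + urban + year",
                    data=df, family=sm.families.Binomial())
    return model.fit()
```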
Abstract:
In this paper, we develop a new family of graph kernels where the graph structure is probed by means of a discrete-time quantum walk. Given a pair of graphs, we let a quantum walk evolve on each graph and compute a density matrix with each walk. With the density matrices for the pair of graphs to hand, the kernel between the graphs is defined as the negative exponential of the quantum Jensen–Shannon divergence between their density matrices. In order to cope with large graph structures, we propose to construct a sparser version of the original graphs using the simplification method introduced in Qiu and Hancock (2007). To this end, we compute the minimum spanning tree over the commute time matrix of a graph. This spanning tree representation minimizes the number of edges of the original graph while preserving most of its structural information. The kernel between two graphs is then computed on their respective minimum spanning trees. We evaluate the performance of the proposed kernels on several standard graph datasets and we demonstrate their effectiveness and efficiency.
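A sketch of the simplification step under stated assumptions: commute times computed from the pseudo-inverse of the graph Laplacian, then a minimum spanning tree taken over the commute-time matrix. Conventions such as the volume scaling are generic and may differ from Qiu and Hancock (2007).

```python
# Illustrative sketch: commute-time matrix from the Laplacian pseudo-inverse
# and the minimum spanning tree over it, returned as a dense adjacency matrix.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def commute_time_matrix(A):
    """Commute times C_uv = vol(G) * (L+_uu + L+_vv - 2 L+_uv)."""
    degrees = A.sum(axis=1)
    L = np.diag(degrees) - A
    L_pinv = np.linalg.pinv(L)
    d = np.diag(L_pinv)
    return degrees.sum() * (d[:, None] + d[None, :] - 2.0 * L_pinv)

def commute_time_spanning_tree(A):
    """Adjacency matrix of the MST computed over the commute-time matrix."""
    C = commute_time_matrix(A)
    mst = minimum_spanning_tree(C).toarray()
    return ((mst + mst.T) > 0).astype(float)   # symmetrize to an undirected tree
```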