16 results for Regularity lemma

in Aston University Research Archive


Relevance: 20.00%

Abstract:

The relationship between sleep apnoea–hypopnoea syndrome (SAHS) severity and the regularity of nocturnal oxygen saturation (SaO2) recordings was analysed. Three different methods were proposed to quantify regularity: approximate entropy (AEn), sample entropy (SEn) and kernel entropy (KEn). A total of 240 subjects suspected of suffering from SAHS took part in the study. They were randomly divided into a training set (96 subjects) and a test set (144 subjects) for the adjustment and assessment of the proposed methods, respectively. According to the measurements provided by AEn, SEn and KEn, higher irregularity of oximetry signals is associated with SAHS-positive patients. Receiver operating characteristic (ROC) and Pearson correlation analyses showed that KEn was the most reliable predictor of SAHS. It provided an area under the ROC curve of 0.91 in two-class classification of subjects as SAHS-negative or SAHS-positive. Moreover, KEn measurements from oximetry data exhibited a linear dependence on the apnoea–hypopnoea index, as shown by a correlation coefficient of 0.87. Therefore, these measurements could be used for the development of simplified diagnostic techniques in order to reduce the demand for polysomnographies. Furthermore, KEn represents a convincing alternative to AEn and SEn for the diagnostic analysis of noisy biomedical signals.
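The two-class evaluation above amounts to ranking SAHS-positive against SAHS-negative subjects by their entropy scores. As a minimal illustration (not the study's code, and with hypothetical scores), the area under the ROC curve can be computed directly from the Mann-Whitney rank relation:

```python
import numpy as np

def roc_auc(scores_neg, scores_pos):
    """AUC via the Mann-Whitney relation: the probability that a
    randomly chosen positive scores higher than a randomly chosen
    negative (ties count one half)."""
    neg = np.asarray(scores_neg, dtype=float)
    pos = np.asarray(scores_pos, dtype=float)
    # Pairwise comparisons: wins plus half-ties, over all pairs.
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical entropy values: SAHS-positive recordings are more irregular.
auc = roc_auc([0.4, 0.5, 0.6], [0.55, 0.8, 0.9])
```

An AUC of 1.0 would mean every positive outscores every negative; the 0.91 reported in the abstract indicates near-complete but not perfect separation.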

Relevance: 10.00%

Abstract:

Based on a simple convexity lemma, we develop bounds for different types of Bayesian prediction errors for regression with Gaussian processes. The basic bounds are formulated for a fixed training set. Simpler expressions are obtained for sampling from an input distribution which equals the weight function of the covariance kernel, yielding asymptotically tight results. The results are compared with numerical experiments.
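The bounds concern the usual Gaussian process posterior, whose mean and pointwise variance follow from standard conditioning. A minimal NumPy sketch of that baseline predictor (not the bounds themselves; the RBF kernel and noise level here are illustrative assumptions):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential covariance kernel on 1-D inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_predict(x, y, x_star, noise=1e-2):
    """Posterior mean and pointwise variance of GP regression with
    Gaussian observation noise (standard conditioning formulas)."""
    K = rbf(x, x) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    k_star = rbf(x_star, x)
    mean = k_star @ alpha
    v = np.linalg.solve(L, k_star.T)
    var = np.diag(rbf(x_star, x_star)) - np.sum(v ** 2, axis=0)
    return mean, var

# Toy data: posterior variance shrinks near the training inputs and
# reverts to the prior far from them.
x = np.array([-1.0, 0.0, 1.0])
y = np.sin(x)
mean, var = gp_predict(x, y, np.array([0.0, 3.0]))
```

The Cholesky-based solve is the standard numerically stable route; the prediction error that the bounds address is the gap between `mean` and the true function at test inputs.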

Relevance: 10.00%

Abstract:

The concept of entropy rate is well defined in dynamical systems theory but impossible to apply directly to finite real-world data sets. With this in mind, Pincus developed Approximate Entropy (ApEn), which uses ideas from Eckmann and Ruelle to create a regularity measure based on entropy rate that can be used to determine the influence of chaotic behaviour in a real-world signal. However, this measure was found not to be robust, and so an improved formulation known as the Sample Entropy (SampEn) was created by Richman and Moorman to address these issues. We have developed a new, related regularity measure which is not based on the theory provided by Eckmann and Ruelle and proves to be a better-behaved measure of complexity than the previous measures, whilst still retaining a low computational cost.
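The abstract does not give the new measure's formula, but SampEn itself is simple to state: for template length m and tolerance r, it is minus the log of the ratio of (m+1)-length template matches to m-length template matches, with self-matches excluded. A minimal NumPy sketch (m=2 and r=0.2·std are the conventional defaults, assumed here):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy (Richman & Moorman): -log(A/B), where B counts
    pairs of length-m templates within tolerance r*std (Chebyshev
    distance) and A counts the same at length m+1; no self-matches."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matches(mm):
        # All overlapping templates of length mm.
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        total = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)  # Chebyshev distance
            total += int(np.sum(d <= tol))
        return total

    a, b = matches(m + 1), matches(m)
    return -np.log(a / b) if a and b else np.inf

# Regular signals score lower than irregular ones.
rng = np.random.default_rng(0)
periodic = np.sin(np.linspace(0, 20 * np.pi, 500))
irregular = rng.standard_normal(500)
```

On such inputs the periodic signal yields a markedly lower value than the white noise, matching the link between signal irregularity and entropy described above.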

Relevance: 10.00%

Abstract:

The development of abnormal protein aggregates in the form of extracellular plaques and intracellular inclusions is a characteristic feature of many neurodegenerative diseases such as Alzheimer's disease (AD), Creutzfeldt-Jakob disease (CJD) and the fronto-temporal dementias (FTD). An important aspect of a pathological protein aggregate is its spatial topography in the tissue. Lesions may not be randomly distributed within a histological section but exhibit a spatial pattern, a departure from randomness either towards regularity or clustering. Information on the spatial pattern of a lesion may be useful in elucidating its pathogenesis and in studying the relationships between different lesions. This article reviews the methods that have been used to study the spatial topography of lesions. These include simple tests of whether the distribution of a lesion departs significantly from random using randomized points or sample fields, and more complex methods that employ grids or transects of contiguous fields and which can detect the intensity of aggregation and the sizes, distribution and spacing of the clusters. The usefulness of these methods in elucidating the pathogenesis of protein aggregates in neurodegenerative disease is discussed.

Relevance: 10.00%

Abstract:

This article reviews the statistical methods that have been used to study the planar distribution, and especially clustering, of objects in histological sections of brain tissue. The objective of these studies is usually quantitative description, comparison between patients or correlation between histological features. Objects of interest such as neurones, glial cells, blood vessels or pathological features such as protein deposits appear as sectional profiles in a two-dimensional section. These objects may not be randomly distributed within the section but exhibit a spatial pattern, a departure from randomness either towards regularity or clustering. The methods described include simple tests of whether the planar distribution of a histological feature departs significantly from randomness using randomized points, lines or sample fields and more complex methods that employ grids or transects of contiguous fields, and which can detect the intensity of aggregation and the sizes, distribution and spacing of clusters. The usefulness of these methods in understanding the pathogenesis of neurodegenerative diseases such as Alzheimer's disease and Creutzfeldt-Jakob disease is discussed. © 2006 The Royal Microscopical Society.

Relevance: 10.00%

Abstract:

Discrete pathological lesions, which include extracellular protein deposits, intracellular inclusions and changes in cell morphology, occur in the brain in the majority of neurodegenerative disorders. These lesions are not randomly distributed in the brain but exhibit a spatial pattern, that is, a departure from randomness towards regularity or clustering. The spatial pattern of a lesion may reflect pathological processes affecting particular neuroanatomical structures and, therefore, studies of spatial pattern may help to elucidate the pathogenesis of a lesion and of the disorders themselves. The present article reviews first, the statistical methods used to detect spatial patterns and second, the types of spatial patterns exhibited by pathological lesions in a variety of disorders which include Alzheimer's disease, Down syndrome, dementia with Lewy bodies, Creutzfeldt-Jakob disease, Pick's disease and corticobasal degeneration. These studies suggest that despite the morphological and molecular diversity of brain lesions, they often exhibit a common type of spatial pattern (i.e. aggregation into clusters that are regularly distributed in the tissue). The pathogenic implications of spatial pattern analysis are discussed with reference to the individual disorders and to studies of neurodegeneration as a whole.

Relevance: 10.00%

Abstract:

In Alzheimer's disease (AD) brain, beta-amyloid (Abeta) deposits and neurofibrillary tangles (NFT) are not randomly distributed but exhibit a spatial pattern, i.e., a departure from randomness towards regularity or clustering. Studies of the spatial pattern of a lesion may contribute to an understanding of its pathogenesis and, therefore, of AD itself. This article describes the statistical methods most commonly used to detect the spatial patterns of brain lesions and the types of spatial patterns exhibited by Abeta deposits and NFT in the cerebral cortex in AD. These studies suggest that within the cerebral cortex, Abeta deposits and NFT exhibit a similar spatial pattern, i.e., an aggregation of individual lesions into clusters which are regularly distributed parallel to the pia mater. The location, size and distribution of these clusters support the hypothesis that AD is a 'disconnection syndrome' in which degeneration of specific cortical pathways results in the formation of clusters of NFT and Abeta deposits. In addition, a model to explain the development of the pathology within the cerebral cortex is proposed.

Relevance: 10.00%

Abstract:

Stereology and other image analysis methods have enabled rapid and objective quantitative measurements to be made on histological sections. These measurements may include total volumes, surfaces, lengths and numbers of cells and blood vessels or pathological lesions. Histological features, however, may not be randomly distributed across a section but exhibit 'dispersion', a departure from randomness either towards regularity or aggregation. Information on population dispersion may be valuable not only in understanding the two- or three-dimensional structure but also in elucidating the pathogenesis of lesions in pathological conditions. This article reviews some of the statistical methods available for studying dispersion. These range from simple tests of whether the distribution of a histological feature departs significantly from random to more complex methods which can detect the intensity of aggregation and the sizes, distribution and spacing of the clusters.

Relevance: 10.00%

Abstract:

We have studied the spatial distribution of plaques in coronal and tangential sections of the parahippocampal gyrus (PHG), the hippocampus, the frontal lobe and the temporal lobe of five SDAT patients. Sections were stained with cresyl violet and examined at two magnifications (x100 and x400). In all cases, and at both magnifications, statistical analysis using the Poisson distribution showed that the plaques were arranged in clumps (x100: V/M = 1.48-4.49; x400: V/M = 1.17-1.95). This indicates that both large-scale and small-scale clumping occurs. Application of the statistical techniques of pattern analysis to coronal sections of frontal and temporal cortex and PHG showed, furthermore, that both large-scale (3200-6400 micron) and small-scale (100-400 micron) clumps were arranged with a high degree of regularity in the tissue. This suggests that the clumps of plaques reflect underlying neural structure.
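The V/M statistic used here is the variance-to-mean ratio of plaque counts per sample field: Poisson (spatially random) counts give V/M near 1, clumping gives V/M > 1, and regular spacing V/M < 1. A minimal sketch with hypothetical field counts:

```python
import numpy as np

def variance_mean_ratio(counts):
    """Variance/mean ratio of quadrat (sample-field) counts.  For a
    Poisson (spatially random) process V/M is about 1; V/M > 1
    indicates clumping, V/M < 1 a regular arrangement."""
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

# Hypothetical plaque counts along a transect of contiguous fields:
# runs of empty fields next to heavily loaded ones suggest clumping.
clumped = [0, 0, 9, 8, 0, 1, 7, 0, 0, 10]
ratio = variance_mean_ratio(clumped)  # well above 1

# (n-1)*V/M is the chi-square index of dispersion commonly used to
# test whether the departure from randomness is significant.
```

Values in the 1.17-4.49 range reported above therefore all point towards clumping, with stronger aggregation at the lower magnification.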

Relevance: 10.00%

Abstract:

There has been much recent research into extracting useful diagnostic features from the electrocardiogram, with numerous studies claiming impressive results. However, the robustness and consistency of the methods employed in these studies is rarely, if ever, mentioned. Hence, we propose two new methods: a biologically motivated time series derived from consecutive P-wave durations, and a mathematically motivated regularity measure. We investigate the robustness of these two methods when compared with current corresponding methods. We find that the new time series performs admirably as a complement to the current method, and the new regularity measure consistently outperforms the current measure in numerous tests on real and synthetic data.

Relevance: 10.00%

Abstract:

The present thesis tested the hypothesis of Stanovich, Siegel, & Gottardo (1997) that surface dyslexia is the result of a milder phonological deficit than that seen in phonological dyslexia, coupled with reduced reading experience. We found that a group of adults with surface dyslexia showed a phonological deficit that was commensurate with that shown by a group of adults with phonological dyslexia (matched for chronological age and verbal and non-verbal IQ) and normal reading experience. We also showed that surface dyslexia cannot be accounted for by a semantic impairment or a deficit in the verbal learning and recall of lexical-semantic information (such as meaningful words), as both dyslexic subgroups performed comparably. This study replicated the results of our published study that surface dyslexia is not the consequence of a mild retardation or reduced learning opportunities but a separate impairment linked to a deficit in written lexical learning, an ability needed to create novel lexical representations from a series of unrelated visual units, which is independent from the phonological deficit (Romani, Di Betta, Tsouknida & Olson, 2008). This thesis also provided evidence that a selective nonword reading deficit in developmental dyslexia persists beyond poor phonology. This was shown by finding a nonword reading deficit even in the presence of normal regularity effects in the dyslexics (when compared to both reading-age and spelling-age matched controls). A nonword reading deficit was also found in the surface dyslexics. Crucially, this deficit was as strong as in the phonological dyslexics despite better functioning of the sublexical route for the former. These results suggest that a nonword reading deficit cannot be solely explained by a phonological impairment. We thus suggested that nonword reading should also involve another ability relating to the processing of novel visual orthographic strings, which we called 'orthographic coding'.
We then investigated the ability to process series of independent units within multi-element visual arrays and its relationship with reading and spelling problems. We identified a deficit in encoding the order of visual sequences (involving both linguistic and nonlinguistic information) which was significantly associated with word and nonword processing. More importantly, we revealed significant contributions to orthographic skills in both dyslexic and control individuals, even after age, performance IQ and phonological skills were controlled. These results suggest that spelling and reading do not only tap phonological skills but also order encoding skills.

Relevance: 10.00%

Abstract:

We examined spelling acquisition, up to late primary school, in children learning a consistent orthography (Italian) and an inconsistent orthography (English). The effects of frequency, lexicality, length, and regularity in modulating the spelling performance of the two groups were examined. English and Italian children were matched for both chronological age and number of years of schooling. Two hundred and seven Italian children and 79 English children took part in the study. We found greater spelling accuracy in Italian than in English children: Italian children were very accurate after only 2 years of schooling, while in English children spelling performance was still poor after 5 years of schooling. Cross-linguistic differences in spelling accuracy proved to be more persistent than the corresponding ones in reading accuracy. Orthographic consistency produced not only quantitative but also qualitative differences, with larger frequency and regularity effects in English than in Italian children.

Relevance: 10.00%

Abstract:

An iterative procedure for determining temperature fields from Cauchy data given on a part of the boundary is presented. At each iteration step, a series of mixed well-posed boundary value problems is solved for the heat operator and its adjoint. A convergence proof of this method in a weighted L2-space is included, as well as a stopping criterion for the case of noisy data. Moreover, a solvability result in a weighted Sobolev space for a parabolic initial boundary value problem of second order with mixed boundary conditions is presented. Regularity of the solution is proved. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

Relevance: 10.00%

Abstract:

Long-term foetal surveillance is often recommended, and fully non-invasive acoustic recording through the maternal abdomen represents a valuable alternative to ultrasonic cardiotocography. Unfortunately, the recorded heart sound signal is heavily loaded with noise, so the determination of the foetal heart rate raises serious signal processing issues. In this paper, we present a new algorithm for foetal heart rate estimation from foetal phonocardiographic recordings. Filtering is employed as a first step of the algorithm to reduce the background noise. A block for enhancing the first heart sounds is then used to further attenuate other components of the foetal heart sound signal. A complex logic block, guided by a number of rules concerning foetal heart beat regularity, is proposed as a successive block for the detection of the most probable first heart sounds from several candidates. A final block is used for exact first heart sound timing and, in turn, foetal heart rate estimation. The filtering and enhancing blocks are implemented by means of different techniques, so that different processing paths are proposed. Furthermore, a reliability index is introduced to quantify the consistency of the estimated foetal heart rate and, based on statistical parameters, a software quality index is designed to indicate the most reliable analysis procedure (that is, the combination of processing path and first heart sound time mark that provides the lowest estimation errors). The algorithm's performance has been tested on phonocardiographic signals recorded in a local gynaecology private practice from a sample group of about 50 pregnant women. Phonocardiographic signals were recorded simultaneously with ultrasonic cardiotocographic signals in order to compare the two foetal heart rate series (the one estimated by our algorithm and the one provided by the cardiotocographic device).
Our results show that the proposed algorithm, in particular some of the analysis procedures, provides reliable foetal heart rate signals, very close to the reference cardiotocographic recordings. © 2010 Elsevier Ltd. All rights reserved.
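The paper's filtering, enhancing and rule-based blocks are not specified in the abstract, but the final step, turning detected first heart sound (S1) times into a heart rate series, is straightforward. A minimal sketch with hypothetical S1 times and a crude illustrative stand-in for the beat-regularity rules:

```python
import numpy as np

def fhr_from_s1_times(s1_times):
    """Foetal heart rate series (beats/min) from detected first heart
    sound (S1) times, via the inter-beat intervals."""
    ibi = np.diff(np.asarray(s1_times, dtype=float))  # seconds between S1s
    return 60.0 / ibi

def regularity_mask(fhr, max_step=25.0):
    """Crude stand-in for the paper's regularity rules (hypothetical):
    flag beats whose rate jumps more than max_step bpm from the
    previous value as unreliable candidates."""
    fhr = np.asarray(fhr, dtype=float)
    ok = np.ones(len(fhr), dtype=bool)
    for i in range(1, len(fhr)):
        ok[i] = abs(fhr[i] - fhr[i - 1]) <= max_step
    return ok

# Hypothetical S1 times (s): a steady ~120 bpm rhythm with one
# spurious detection at 0.75 s splitting a beat in two.
fhr = fhr_from_s1_times([0.0, 0.5, 0.75, 1.25, 1.75])
mask = regularity_mask(fhr)  # the spurious beat is flagged
```

In the actual algorithm, candidates rejected by the regularity rules would be discarded before the final S1 timing step; the reliability index described above would then score the consistency of the surviving series.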

Relevance: 10.00%

Abstract:

The logic of ‘time’ in modern capitalist society appears to be a fixed concept. Time dictates human activity with a regularity which, as long ago as 1944, George Woodcock referred to as The Tyranny of the Clock. Seventy years on, Hartmut Rosa suggests humans no longer maintain speed to achieve something new, but simply to preserve the status quo, in a ‘social acceleration’ that is lethal to democracy. Political engagement takes time we no longer have, as we rush between our virtual spaces and ‘non-places’ of higher education. I suggest it is time to confront the conspirators that, in partnership with the clock, accelerate our social engagements with technology in the context of learning. Through Critical Discourse Analysis (CDA) I reveal an alarming situation if we don’t. With reference to Bauman’s Liquid Modernity, I observe a ‘lightness’ in policy texts where humans have been ‘liquified’. Separating people from their own labour with technology in policy maintains the flow of speed a neoliberal economy demands. I suggest a new ‘solidity’ of human presence is required as we write about networked learning. ‘Writing ourselves back in’ requires a commitment to ‘be there’ in policy and to provide arguments that decelerate the tyranny of time. I am, though, ever mindful that social acceleration is also of our own making, and there is every possibility that we actually enjoy it.