8 results for "time-variant reliability"

in CentAUR: Central Archive, University of Reading, UK


Relevance: 80.00%

Abstract:

This work compares and contrasts results of classifying time-domain ECG signals with pathological conditions taken from the MIT-BIH arrhythmia database. Linear discriminant analysis and a multi-layer perceptron were used as classifiers. The neural network was trained by two different methods, namely back-propagation and a genetic algorithm. Converting the time-domain signal into the wavelet domain reduced the dimensionality of the problem at least 10-fold. This was achieved using wavelets from the db6 family as well as adaptive wavelets generated using two different strategies. The wavelet transforms used in this study were limited to two decomposition levels. A neural network with evolved weights proved to be the best classifier, with a maximum of 99.6% accuracy when optimised wavelet-transform ECG data was presented to its input and 95.9% accuracy when the signals presented to its input were decomposed using db6 wavelets. The linear discriminant analysis achieved a maximum classification accuracy of 95.7% when presented with optimised and 95.5% with db6 wavelet coefficients. It is shown that the much simpler signal representation of a few wavelet coefficients obtained through an optimised discrete wavelet transform considerably facilitates the task of classifying non-stationary time-variant signals. In addition, the results indicate that wavelet optimisation may improve the classification ability of a neural network. (c) 2005 Elsevier B.V. All rights reserved.
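As a rough illustration of the dimensionality-reduction step, the sketch below applies a two-level discrete wavelet transform to a synthetic beat and keeps only the approximation coefficients. It substitutes the simple Haar wavelet for the db6 and adaptive wavelets used in the study, and the input signal is made up; it is a sketch of the idea, not the paper's pipeline.

```python
import numpy as np

def haar_dwt_level(x):
    """One level of the Haar DWT: return (approximation, detail) coefficients."""
    x = x[: len(x) // 2 * 2]                     # truncate to even length
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def wavelet_features(signal, levels=2):
    """Two-level decomposition, as in the study; keep only the coarse
    approximation as a compact feature vector for the classifier."""
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        approx, _ = haar_dwt_level(approx)
    return approx

# A 256-sample synthetic beat collapses to 64 coefficients (4-fold here;
# deeper decompositions or coefficient selection give the >10-fold reduction).
beat = np.sin(np.linspace(0, 4 * np.pi, 256))
features = wavelet_features(beat, levels=2)
print(len(features))   # 64
```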

Relevance: 80.00%

Abstract:

The electrochemistry of Pt nanostructured electrodes is investigated using hydrodynamic modulated voltammetry (HMV). Here a liquid crystal templating process is used to produce platinum-modified electrodes with a range of surface areas (roughness factor 42.4-280.8). The electroreduction of molecular oxygen at these nanostructured platinum surfaces is used to demonstrate the ability of HMV to discriminate between faradaic and nonfaradaic electrode reactions. The HMV approach shows that the reduction of molecular oxygen experiences considerable signal loss within the high pseudocapacitive region of the voltammetry. Evidence for the contribution of the double layer to transient mass transfer events is presented. In addition, a model circuit and appropriate theoretical analysis are used to illustrate the transient responses of a time-variant faradaic component. This, in conjunction with the experimental evidence, shows that, far from being a passive component in this system, the double layer can contribute to HMV faradaic reactions under certain conditions.
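The transient response of a time-variant faradaic component can be illustrated with a toy Randles-type circuit: a double-layer capacitance in parallel with a faradaic resistance that is modulated in time, fed through a solution resistance. All component values below are hypothetical, not the paper's fitted model.

```python
import numpy as np

# Hypothetical cell: solution resistance Rs, double-layer capacitance Cdl,
# and a faradaic resistance Rf(t) modulated at 2 Hz, mimicking hydrodynamic
# modulation of mass transfer.
Rs, Cdl = 100.0, 20e-6          # ohm, farad
E_app = 0.5                     # applied potential step, volt
dt, n = 1e-5, 20000
t = np.arange(n) * dt

def Rf(tk):                     # time-variant faradaic resistance
    return 1e3 * (1.0 + 0.5 * np.sin(2 * np.pi * 2 * tk))

V = 0.0
i_far = np.empty(n)
i_cap = np.empty(n)
for k in range(n):
    i_f = V / Rf(t[k])                       # faradaic branch current
    dV = ((E_app - V) / Rs - i_f) / Cdl      # charge balance on Cdl (V/s)
    i_cap[k] = Cdl * dV                      # capacitive (double-layer) current
    i_far[k] = i_f
    V += dV * dt

# The capacitive spike decays with time constant ~Rs*Cdl (a few ms), after
# which the slowly modulated faradaic current dominates.
print(f"faradaic current at t = 0.2 s: {i_far[-1] * 1e3:.3f} mA")
```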

Relevance: 80.00%

Abstract:

This paper introduces a new blind equalisation algorithm for pulse amplitude modulation (PAM) data transmitted through nonminimum phase (NMP) channels. The algorithm itself is based on a noncausal AR model of communication channels and the second- and fourth-order cumulants of the received data series, where only the diagonal slices of the cumulants are used. The AR parameters are adjusted at each sample by using a successive over-relaxation (SOR) scheme, a variant of the ordinary LMS scheme, but with a faster convergence rate and a greater robustness to the selection of the ‘step-size’ in iterations. Computer simulations are implemented for both linear time-invariant (LTI) and linear time-variant (LTV) NMP channels, and the results show that the algorithm proposed in this paper has a fast convergence rate and a potential capability to track LTV NMP channels.
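As a stand-alone illustration of the over-relaxation idea behind the parameter update, the sketch below applies classical SOR to a small, hypothetical linear system; the real algorithm builds its equations per sample from second- and fourth-order cumulant slices, which is not reproduced here.

```python
import numpy as np

def sor_solve(A, b, omega=1.2, iters=200):
    """Successive over-relaxation for A x = b. With 1 < omega < 2 it
    typically converges faster than Gauss-Seidel (omega = 1), the property
    the algorithm exploits over a plain LMS-style update."""
    x = np.zeros_like(b, dtype=float)
    n = len(b)
    for _ in range(iters):
        for i in range(n):
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    return x

# Hypothetical normal equations standing in for the AR-parameter system.
A = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 1.0],
              [0.5, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
a_hat = sor_solve(A, b)
print(np.allclose(A @ a_hat, b, atol=1e-6))   # True
```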

Relevance: 30.00%

Abstract:

The purpose of this study was to apply and compare two time-domain analysis procedures in the determination of oxygen uptake (VO2) kinetics in response to a pseudorandom binary sequence (PRBS) exercise test. PRBS exercise tests have typically been analysed in the frequency domain. However, the complex interpretation of frequency responses may have limited the application of this procedure in both sporting and clinical contexts, where a single time measurement would facilitate subject comparison. The relative potential of both a mean response time (MRT) and a peak cross-correlation time (PCCT) was investigated. This study was divided into two parts: a test-retest reliability study (part A), in which 10 healthy male subjects completed two identical PRBS exercise tests, and a comparison of the VO2 kinetics of 12 elite endurance runners (ER) and 12 elite sprinters (SR; part B). In part A, 95% limits of agreement were calculated for comparison between MRT and PCCT. The results of part A showed no significant difference between test and retest as assessed by MRT [mean (SD) 42.2 (4.2) s and 43.8 (6.9) s] or by PCCT [21.8 (3.7) s and 22.7 (4.5) s]. Measurement error (%) was lower for MRT in comparison with PCCT (16% and 25%, respectively). In part B of the study, the VO2 kinetics of ER were significantly faster than those of SR, as assessed by MRT [33.4 (3.4) s and 39.9 (7.1) s, respectively; P < 0.01] and PCCT [20.9 (3.8) s and 24.8 (4.5) s; P < 0.05]. It is possible that either analysis procedure could provide a single test measurement of VO2 kinetics; however, the greater reliability of the MRT data suggests that this method has more potential for development in the assessment of VO2 kinetics by PRBS exercise testing.
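A minimal sketch of the two measures on synthetic data: a PRBS work-rate input drives a first-order VO2 response with an assumed delay and time constant (the delay, time constant, sampling rate and signal values below are invented, not the study's data).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic breath-by-breath stand-in: PRBS input, delayed first-order
# VO2 response, 1 s sampling.
n, tau, d = 3000, 30.0, 20                        # samples, s, s (assumed)
prbs = rng.integers(0, 2, n).astype(float)        # 0/1 work-rate levels
t = np.arange(n, dtype=float)
h = np.zeros(n)
h[d:] = np.exp(-t[: n - d] / tau) / tau           # delayed first-order kernel
vo2 = np.convolve(prbs, h)[:n]                    # simulated uptake

# Peak cross-correlation time (PCCT): lag of maximum input-output correlation.
lags = np.arange(1, 120)
xcorr = [np.corrcoef(prbs[:-k], vo2[k:])[0, 1] for k in lags]
pcct = int(lags[np.argmax(xcorr)])

# Mean response time (MRT) via the area method on the step response;
# for a delayed first-order system MRT ~ delay + tau.
step = np.cumsum(h)
mrt = float(np.sum(1.0 - step / step[-1]))

print("PCCT ~", pcct, "s; MRT ~", round(mrt, 1), "s")
```

With these assumed dynamics the MRT lands near delay + tau (about 50 s) while the PCCT sits near the delay itself, mirroring the study's finding that the two measures sit on different timescales.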

Relevance: 30.00%

Abstract:

BACKGROUND: Due to the heterogeneity in the biological behavior of prostate cancer, biomarkers that can reliably distinguish indolent from aggressive disease are urgently needed to inform treatment choices. METHODS: We employed 8-plex isobaric Tags for Relative and Absolute Quantitation (iTRAQ) to profile the proteomes of two distinct panels of isogenic prostate cancer cells with varying growth and metastatic potentials, in order to identify novel biomarkers associated with progression. The LNCaP, LNCaP-Pro5, and LNCaP-LN3 panel of cells represents a model of androgen-responsive prostate cancer, while the PC-3, PC-3M, and PC-3M-LN4 panel represents a model of androgen-insensitive disease. RESULTS: Of the 245 unique proteins identified and quantified (≥95% confidence; ≥2 peptides/protein), 17 showed significant differential expression (≥1.5-fold in either direction) in at least one of the variant LNCaP cells relative to parental cells. Similarly, comparisons within the PC-3 panel identified 45 proteins showing significant differential expression in at least one of the variant PC-3 cells compared with parental cells. Differential expression of selected candidates was verified by Western blotting or immunocytochemistry, and corresponding mRNA expression was determined by quantitative real-time PCR (qRT-PCR). Immunostaining of prostate tissue microarrays for ERp5, one of the candidates identified, showed significantly higher immunoexpression in pre-malignant lesions compared with non-malignant epithelium (P < 0.0001, Mann-Whitney U-test), and in high Gleason grade (4-5) versus low grade (2-3) cancers (P < 0.05). CONCLUSIONS: Our study provides proof of principle for the application of an 8-plex iTRAQ approach to uncover clinically relevant candidate biomarkers for prostate cancer progression.
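The ≥1.5-fold criterion for calling differential expression amounts to a simple ratio filter. In the sketch below, ERp5 comes from the abstract, but the other protein names and all ratio values are hypothetical.

```python
# Hypothetical iTRAQ ratios (variant line vs parental line).
ratios = {"ERp5": 1.8, "HSP60": 1.1, "VIM": 0.6, "ACTB": 0.95}

def differential(ratios, fold=1.5):
    """Keep proteins changed >= fold in either direction:
    ratio >= 1.5 (up) or ratio <= 1/1.5 (down)."""
    return {p: r for p, r in ratios.items()
            if r >= fold or r <= 1.0 / fold}

print(sorted(differential(ratios)))   # ['ERp5', 'VIM']
```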

Relevance: 30.00%

Abstract:

The family of theories dubbed ‘luck egalitarianism’ represents an attempt to infuse egalitarian thinking with a concern for personal responsibility, arguing that inequalities are just when, or to the extent that, they result from choice, but unjust when, or to the extent that, they result from luck. In this essay I argue that luck egalitarians should sometimes seek to limit inequalities even when they have a fully choice-based pedigree (i.e., result only from the choices of agents). I grant that the broad approach is correct but argue that the temporal standpoint from which we judge whether, or to what extent, a person can be held responsible should be radically altered. Instead of asking, as Standard (or Static) Luck Egalitarianism seems to, whether or not, or to what extent, a person was responsible for the choice at the time of choosing, and asking the question of responsibility only once, we should ask whether, or to what extent, they are responsible for the choice at the point at which we are seeking to discover whether, or to what extent, the inequality is just; the question of responsibility is thus not settled once and for all but constantly under review. Such an approach will differ from Standard Luck Egalitarianism only if responsibility for a choice is not set in stone: if responsibility can weaken, then we should not see the boundary between luck and responsibility within a particular action as static. Drawing on Derek Parfit’s illuminating discussions of personal identity, and on the contemporary literature on moral responsibility, I suggest there are good reasons to think that responsibility can weaken: we are not necessarily fully responsible for a choice forever, even if we were fully responsible at the time of choosing. I call the variant of luck egalitarianism that recognises this shift in temporal standpoint, and that responsibility can weaken, Dynamic Luck Egalitarianism (DLE).
In conclusion I offer a preliminary discussion of what kind of policies DLE would support.

Relevance: 30.00%

Abstract:

A necessary condition for a good probabilistic forecast is that the forecast system is shown to be reliable: forecast probabilities should equal observed probabilities verified over a large number of cases. As climate change trends are now emerging from the natural variability, we can apply this concept to climate predictions and compute the reliability of simulated local and regional temperature and precipitation trends (1950–2011) in a recent multi-model ensemble of climate model simulations prepared for the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5). With only a single verification time, the verification is over the spatial dimension. The local temperature trends appear to be reliable. However, when the global mean climate response is factored out, the ensemble is overconfident: the observed trend is outside the range of modelled trends in many more regions than would be expected by the model estimate of natural variability and model spread. Precipitation trends are overconfident for all trend definitions. This implies that for near-term local climate forecasts the CMIP5 ensemble cannot simply be used as a reliable probabilistic forecast.
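The reliability check described here (does the observed trend fall inside the ensemble spread in the expected fraction of regions?) can be sketched with synthetic numbers; the ensemble size, trend distributions and region count below are invented, not CMIP5 data.

```python
import numpy as np

rng = np.random.default_rng(1)

# M-member ensemble of simulated trends per region. If the ensemble is
# reliable, an observation drawn from the same distribution falls outside
# the ensemble min-max range in roughly 2/(M+1) of regions.
M, regions = 20, 500
ens = rng.normal(0.2, 0.1, size=(regions, M))       # modelled trends

obs_reliable = rng.normal(0.2, 0.1, size=regions)   # same spread as ensemble
obs_overconf = rng.normal(0.2, 0.2, size=regions)   # truth wider than ensemble

def frac_outside(obs, ens):
    """Fraction of regions where the observed trend lies outside the
    ensemble range, the overconfidence diagnostic used in spatial verification."""
    return float(np.mean((obs < ens.min(axis=1)) | (obs > ens.max(axis=1))))

expected = 2 / (M + 1)
print(f"expected {expected:.2f}, reliable {frac_outside(obs_reliable, ens):.2f}, "
      f"overconfident {frac_outside(obs_overconf, ens):.2f}")
```

When the true spread exceeds the ensemble spread, the outside fraction climbs well above 2/(M+1), which is the signature of overconfidence the abstract reports for precipitation and residual temperature trends.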

Relevance: 30.00%

Abstract:

A flood warning system incorporates telemetered rainfall and flow/water-level data measured at various locations in the catchment area. Real-time, accurate data collection is required for this use, and sensor networks improve the system capabilities. However, existing sensor nodes struggle to satisfy the hydrological requirements in terms of autonomy, sensor hardware compatibility, reliability and long-range communication. We describe the design and development of a real-time measurement system for flood monitoring, and its deployment in a flash-flood-prone 650 km² semiarid watershed in Southern Spain. A purpose-developed low-power, long-range communication device, DatalogV1, provides automatic data gathering and reliable transmission. DatalogV1 incorporates self-monitoring for adapting measurement schedules, both to manage power consumption and to capture events of interest. Two tests are used to assess the success of the development. The results show an autonomous and robust monitoring system for long-term collection of water-level data in many sparse locations during flood events.
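A minimal sketch of the kind of self-monitoring schedule adaptation described: sample slowly to conserve power, then switch to a fast schedule when the water level rises quickly. The thresholds and intervals are hypothetical; the actual DatalogV1 firmware logic is not given in the abstract.

```python
def next_interval(level, prev_level, slow=900, fast=60, rise_thresh=0.05):
    """Return seconds until the next measurement.
    level/prev_level: water level readings in metres (hypothetical units);
    slow/fast: quiet vs event sampling intervals in seconds;
    rise_thresh: rise per sample (m) that flags an event of interest."""
    return fast if (level - prev_level) > rise_thresh else slow

print(next_interval(1.20, 1.18))   # 900  (quiet: 15-min schedule)
print(next_interval(1.40, 1.20))   # 60   (rising fast: 1-min schedule)
```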