8 results for VISUAL INSPECTION
in Aston University Research Archive
Abstract:
Background: Vigabatrin (VGB) is an anti-epileptic medication that has been linked to peripheral constriction of the visual field. Documenting the natural history associated with continued VGB exposure is important when making decisions about the risks and benefits of treatment. Due to its speed, the Swedish Interactive Threshold Algorithm (SITA) has become the algorithm of choice, in preference to the Full Threshold algorithm, when carrying out automated static perimetry. SITA uses prior distributions of normal and glaucomatous visual field behaviour to estimate threshold sensitivity. As the abnormal model is based on glaucomatous behaviour, the algorithm has not been validated for VGB recipients. We aimed to assess the clinical utility of the SITA algorithms for accurately mapping VGB-attributed field loss. Methods: The sample comprised one randomly selected eye of 16 patients diagnosed with epilepsy and exposed to VGB therapy. A clinical diagnosis of VGB-attributed visual field loss was documented in 44% of the group. The mean age was 39.3 ± 14.5 years and the mean deviation was -4.76 ± 4.34 dB. Each patient was examined with the Full Threshold, SITA Standard and SITA Fast algorithms. Results: SITA Standard was on average approximately twice as fast (7.6 minutes) and SITA Fast approximately three times as fast (4.7 minutes) as examinations completed using the Full Threshold algorithm (15.8 minutes). In the clinical environment, the visual field outcome with both SITA algorithms was equivalent to that of the Full Threshold algorithm in terms of visual inspection of the grey-scale plots, defect area and defect severity. Conclusions: Our research shows that both SITA algorithms are able to accurately map visual field loss attributed to VGB. As patients diagnosed with epilepsy are often vulnerable to fatigue, the time saving offered by SITA Fast gives this algorithm a significant advantage for use with VGB recipients.
Abstract:
Objective: To investigate the extent to which measures of nonlinearity derived from surrogate data analysis can quantify the changes in epileptic activity related to varying vigilance levels. Methods: Surface EEG and intracranial EEG from foramen ovale (FO) electrodes were recorded over one night from a patient with temporal lobe epilepsy under presurgical evaluation. Different measures of nonlinearity were estimated for non-overlapping 30-s segments of selected channels from the surface and intracranial EEG. Additionally, spectral measures were calculated. Sleep stages were scored according to Rechtschaffen and Kales, and epileptic transients were counted and classified by visual inspection. Results: In the intracranial recordings, stronger nonlinearity was found ipsilateral to the epileptogenic focus, more pronounced in NREM sleep and weaker in REM sleep. The dynamics within the NREM episodes varied between the different nonlinearity measures. Some nonlinearity measures also varied with the sleep cycle in the intracranial recordings contralateral to the epileptic focus and in the surface EEG. The nonlinearity is shown to be correlated with short-term fluctuations of the delta power. The higher frequency of occurrence of clinically relevant epileptic spikes in the first NREM episode was not clearly reflected in the nonlinearity measures. Conclusions: It was confirmed that epileptic activity renders the EEG nonlinear. However, sleep dynamics itself also affects the nonlinearity measures. Therefore, at the present stage it is not possible to establish a unique connection between the studied nonlinearity measures and specific types of epileptic activity in sleep EEG recordings.
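As an illustration of the surrogate-data approach mentioned in this abstract, the sketch below tests a single 30-s EEG segment for nonlinearity by comparing a simple statistic against phase-randomized surrogates. The abstract does not name the specific nonlinearity measures used, so the time-reversal-asymmetry statistic, the sampling rate, and the synthetic segment here are placeholder assumptions, not the study's actual method.

```python
# Hedged sketch: surrogate-data test for nonlinearity on one 30-s EEG segment.
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Fourier-transform surrogate: same power spectrum, randomized phases."""
    n = len(x)
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(spectrum))
    phases[0] = 0.0  # keep the mean (DC component) untouched
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n)

def time_reversal_asymmetry(x, lag=1):
    """Simple nonlinearity statistic; close to zero for linear Gaussian data."""
    d = x[lag:] - x[:-lag]
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

def nonlinearity_score(segment, n_surrogates=39, seed=0):
    """Rank the original statistic within the surrogate distribution."""
    rng = np.random.default_rng(seed)
    t_orig = time_reversal_asymmetry(segment)
    t_surr = [time_reversal_asymmetry(phase_randomized_surrogate(segment, rng))
              for _ in range(n_surrogates)]
    # Fraction of surrogates whose |statistic| lies below the original value:
    return np.mean(np.abs(t_surr) < np.abs(t_orig))

# Example: one 30-s segment sampled at an assumed 256 Hz (synthetic data).
segment = np.random.default_rng(1).standard_normal(30 * 256)
print(nonlinearity_score(segment))
```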
Abstract:
Fast pyrolysis liquid, or bio-oil, has been used in engines with limited success. It requires a pilot fuel and/or an additive for successful combustion, and there are problems with materials and liquid properties. It is immiscible with all conventional hydrocarbon fuels. Biodiesel, a product of the esterification of vegetable oil with an alcohol, is widely used as a renewable liquid fuel as an additive to diesel at up to 20%. There are, however, limits to its use in conventional engines due to poor low-temperature performance and variability in quality arising from the variety of vegetable oil qualities and esterification processes. Within the European Project Bioliquids-CHP, a joint project between the European Commission and Russia, a study was undertaken to develop small-scale CHP units based on engines and microturbines fuelled with bioliquids from fast pyrolysis and methyl esters of vegetable oil. Blends of bio-oil and biodiesel were evaluated and tested to overcome some of the disadvantages of using either fuel by itself. An alcohol was used as the co-solvent, in the form of ethanol, 1-butanol or 2-propanol. Visual inspection of blend homogeneity after 48 h was used as an indicator of product stability, and the results were plotted in a three-phase chart for each alcohol used. An accelerated stability test was performed on selected samples in order to predict their long-term stability. We concluded that the type and quantity of alcohol are critical for blend formation and stability. 1-Butanol gave the widest selection of stable blends, followed by 2-propanol and finally ethanol; 1-butanol blends thus accepted the largest proportion of bio-oil in the mixture.
Abstract:
A visualization plot of molecular data is a useful tool for gaining insight into a set of molecules. In chemoinformatics, most visualization plots are of molecular descriptors, and the statistical model most often used to produce a visualization is principal component analysis (PCA). This paper takes PCA, together with four other statistical models (NeuroScale, GTM, LTM, and LTM-LIN), and evaluates their ability to produce clustering in visualizations not of molecular descriptors but of molecular fingerprints. Two different tasks are addressed: understanding structural information (particularly combinatorial libraries) and relating structure to activity. The quality of the visualizations is compared both subjectively (by visual inspection) and objectively (with global distance comparisons and local k-nearest-neighbor predictors). On the data sets used to evaluate clustering by structure, LTM is found to perform significantly better than the other models. In particular, the clusters in LTM visualization space are consistent with the relationships between the core scaffolds that define the combinatorial sublibraries. On the data sets used to evaluate clustering by activity, LTM again gives the best performance, but by a smaller margin. The results of this paper demonstrate the value of using both a nonlinear projection map and a Bernoulli noise model for modeling binary data.
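For readers unfamiliar with the baseline model, the sketch below shows the kind of PCA projection of binary fingerprints against which the nonlinear models are compared, plus a simple 1-nearest-neighbor check in the spirit of the paper's local evaluation. The fingerprint matrix and scaffold labels are random placeholders, not the paper's data sets.

```python
# Hedged sketch: PCA visualization of binary molecular fingerprints with a
# subjective (scatter plot) and an objective (nearest-neighbor) assessment.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fingerprints = rng.integers(0, 2, size=(500, 1024))  # 500 molecules, 1024-bit fingerprints
scaffold = rng.integers(0, 4, size=500)              # hypothetical sublibrary labels

# Project the binary fingerprints onto the first two principal components.
coords = PCA(n_components=2).fit_transform(fingerprints.astype(float))

# Subjective check: visual inspection of clustering by scaffold.
plt.scatter(coords[:, 0], coords[:, 1], c=scaffold, s=10, cmap="tab10")
plt.xlabel("PC 1"); plt.ylabel("PC 2")
plt.title("PCA of binary fingerprints")
plt.show()

# Objective check in the spirit of the paper's local k-NN criterion:
# can scaffold membership be predicted from position in the 2-D plot?
print(cross_val_score(KNeighborsClassifier(n_neighbors=1),
                      coords, scaffold, cv=5).mean())
```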
Abstract:
The aim of this work was to investigate the feasibility of detecting and locating damage in large frame structures where visual inspection would be difficult or impossible. The method is based on a vibration technique for non-destructively assessing the integrity of structures using measurements of changes in the natural frequencies; such measurements can be made at a single point in the structure. The method requires that a comprehensive theoretical vibration analysis of the structure is first undertaken, from which predictions are made of the changes in dynamic characteristics that will occur if each member of the structure is damaged in turn. The natural frequencies of the undamaged structure are measured and then routinely remeasured at intervals. If a change in the natural frequencies is detected, a statistical method is used to find the best match between the measured changes in frequency and the family of theoretical predictions; this identifies the most likely damage site. The theoretical analysis was based on the finite element method. Many structures were extensively studied, and a computer model was used to simulate the effect of the extent and location of the damage on the natural frequencies. Only one such analysis is required for each structure to be investigated. The experimental study was conducted on small structures in the laboratory. Frequency changes were found from inertance measurements on various plane and space frames. The computational requirements of the location analysis are small, and a desktop microcomputer was used. The results of this work showed that the method was successful in detecting and locating damage in the test structures.
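A minimal sketch of the matching step described above, assuming a small illustrative family of predicted frequency-change patterns (one per candidate damaged member) and a measured change vector. The numbers are invented for illustration, and the scale-invariant least-squares match is one plausible reading of the "statistical method", not the thesis's exact procedure.

```python
# Hedged sketch: locate the most likely damage site by matching measured
# natural-frequency changes against precomputed predictions (e.g. from a
# finite element model), one predicted pattern per candidate damaged member.
import numpy as np

# Predicted fractional frequency changes: 3 candidate damage sites x 5 modes.
predicted = np.array([
    [0.020, 0.001, 0.010, 0.004, 0.002],   # member 1 damaged
    [0.002, 0.015, 0.003, 0.012, 0.001],   # member 2 damaged
    [0.005, 0.004, 0.001, 0.002, 0.018],   # member 3 damaged
])

# Measured fractional changes: (remeasured - baseline) / baseline frequencies.
measured = np.array([0.004, 0.013, 0.002, 0.011, 0.001])

def best_match(measured, predicted):
    """Scale-invariant least-squares match: damage severity scales the whole
    pattern, so compare normalized change vectors and rank sites by residual."""
    m = measured / np.linalg.norm(measured)
    residuals = [np.linalg.norm(m - p / np.linalg.norm(p)) for p in predicted]
    return int(np.argmin(residuals)), residuals

site, residuals = best_match(measured, predicted)
print(f"Most likely damage site: member {site + 1}, "
      f"residuals = {np.round(residuals, 3)}")
```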
Discriminating antigen and non-antigen using proteome dissimilarity III: tumour and parasite antigens
Abstract:
Computational genome analysis enables the systematic identification of potential immunogenic proteins within a pathogen. Immunogenicity is a system property that arises through the interaction of host and pathogen, mediated through the medium of an immunogenic protein. The overt dissimilarity of pathogenic proteins compared to the host proteome is conjectured by some to be the determining principle of immunogenicity. Previously, we explored this idea in the context of bacterial, viral, and fungal antigens. In this paper, we broaden and extend our analysis to include complex antigens of eukaryotic origin, arising from tumours and from parasite pathogens. For both types of antigen, known antigenic and non-antigenic protein sequences were compared to the human and mouse proteomes. In contrast to our previous results, both visual inspection and statistical evaluation indicate a much wider range of homologues and a significant level of discrimination; but, as before, we could not determine a viable threshold capable of properly separating non-antigens from antigens. Together with our previous work, we conclude that global proteome dissimilarity is not a useful metric of immunogenicity for presently available antigens arising from bacteria, viruses, fungi, parasites, and tumours. While we see some signal for certain antigen types, dissimilarity is not a useful approach to identifying antigenic molecules within pathogen genomes.
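To make the threshold question concrete, the sketch below scores placeholder candidate proteins by their similarity to a toy host proteome and asks whether any single dissimilarity cutoff would separate antigens from non-antigens. The shared-k-mer score stands in for the sequence-search tool actually used in the study, and all sequences and labels are invented.

```python
# Hedged sketch: proteome-dissimilarity scoring with a threshold/ROC check.
import numpy as np
from sklearn.metrics import roc_auc_score

def kmer_set(seq, k=4):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def host_similarity(candidate, host_proteins, k=4):
    """Maximum fraction of the candidate's k-mers shared with any host protein."""
    cand = kmer_set(candidate, k)
    if not cand:
        return 0.0
    return max(len(cand & kmer_set(h, k)) / len(cand) for h in host_proteins)

# Placeholder data: toy host proteome and labelled candidates (1 = antigen).
host = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
        "MSDNGPQNQRNAPRITFGGPSDSTGSNQNGERS"]
candidates = ["MKTAYIAKQRQISFVK", "GAVLILLVITALAGCS",
              "QRNAPRITFGG", "WWNDRRSSPQKLTTEE"]
labels = np.array([0, 1, 0, 1])

dissimilarity = 1.0 - np.array([host_similarity(c, host) for c in candidates])
# An AUC near 0.5 would mean no single dissimilarity cutoff separates the classes.
print(roc_auc_score(labels, dissimilarity))
```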
Abstract:
We consider the effects of salt (sodium iodide) on pristine carbon nanotube (CNT) dispersions in an organic solvent, N-methyl-2-pyrrolidone (NMP). We investigate the molecular-scale mechanisms of ion interactions with the nanotube surface and show how the microscopic ion-surface interactions affect the stability of CNT dispersions in NMP. Our study combines fully atomistic Molecular Dynamics simulations of sodium and iodide ions at the CNT-NMP interface with direct experiments on the CNT dispersions. In the experiments we analyze the effects of salt on the stability of the dispersions by photoluminescence (PL) and optical absorption spectroscopy of the samples, as well as by visual inspection. With the fully atomistic Molecular Dynamics simulations we investigate the molecular-scale mechanisms of sodium and iodide ion interactions with the nanotube surface. Our simulations reveal that both ions are depleted from the CNT surface in the CNT-NMP dispersions, mainly for two reasons: (1) there is a high energy penalty for partial desolvation of the ions at the CNT surface; and (2) NMP molecules form a dense solvation layer at the CNT surface that prevents ions from coming close to it. As a result, an increase in the salt concentration increases the "osmotic" stress in the CNT-NMP system and thus decreases the stability of the CNT dispersions in NMP. Direct experiments confirm the simulation results: addition of NaI salt to the NMP dispersions of pristine CNTs leads to precipitation of CNTs (bundle formation) even at very small salt concentrations (∼10⁻³ mol L⁻¹). In line with the simulation predictions, the effect increases with increasing salt concentration. Overall, our results show that dissolved salt ions have strong effects on the stability of CNT dispersions. It is therefore possible to stimulate bundle formation in the CNT-NMP dispersions and regulate the overall concentration of nanotubes in the dispersions by changing the NaI concentration in the solvent.
Abstract:
In this study we aimed to evaluate the impact of ageing and gender on different visual mental imagery processes. Two hundred and fifty-one participants (130 women and 121 men; age range = 18–77 years) were given an extensive neuropsychological battery including tasks probing the generation, maintenance, inspection, and transformation of visual mental images (Complete Visual Mental Imagery Battery, CVMIB). Our results show that all mental imagery processes, with the exception of maintenance, are affected by ageing, suggesting that other deficits, such as working memory deficits, could account for this effect. However, the analysis of the transformation process, investigated in terms of mental rotation and mental folding skills, shows a steeper decline in mental rotation, suggesting that age could affect rigid transformations of objects while sparing non-rigid transformations. Our study also adds to previous ones in showing gender differences favoring men across the lifespan in the transformation process, and, interestingly, it shows a steeper decline in men than in women in inspecting mental images, which could partially account for the mixed results about the effect of ageing on this specific process. We also discuss the possibility of introducing the CVMIB into clinical assessment in the context of theoretical models of mental imagery.