46 results for non-conscious cognitive processing (NCCP) time

in the Aston University Research Archive


Relevance: 100.00%

Abstract:

Event-related potentials (ERP) have been proposed to improve the differential diagnosis of non-responsive patients. We investigated the potential of the P300 as a reliable marker of conscious processing in patients with locked-in syndrome (LIS). Eleven chronic LIS patients and 10 healthy subjects (HS) listened to a complex-tone auditory oddball paradigm, first in a passive condition (listen to the sounds) and then in an active condition (counting the deviant tones). Seven out of nine HS displayed a P300 waveform in the passive condition and all in the active condition. HS showed statistically significant changes in peak and area amplitude between conditions. Three out of seven LIS patients showed the P300 waveform in the passive condition and five of seven in the active condition. No changes in peak amplitude, and a significant difference in area amplitude at only one electrode, were observed in this group between conditions. We conclude that, despite retaining full consciousness and intact or nearly intact cortical function, LIS patients yield less reliable ERP results than HS, particularly in the passive condition. We therefore strongly recommend applying ERP paradigms in an active condition when evaluating consciousness in non-responsive patients.
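The passive/active paradigm above rests on a standard oddball stimulus stream. A minimal sketch of such a sequence generator follows; the 20% deviant probability and the rule that deviants never repeat back-to-back are illustrative assumptions, not the study's actual parameters.

```python
import random

def oddball_sequence(n_trials, deviant_prob=0.2, seed=0):
    """Generate an auditory oddball trial sequence.

    Returns a list of 'standard'/'deviant' labels; the sequence starts
    with a standard tone and never presents two deviants in a row.
    """
    rng = random.Random(seed)
    seq = ["standard"]  # conventionally begin with a standard
    for _ in range(n_trials - 1):
        if rng.random() < deviant_prob and seq[-1] == "standard":
            seq.append("deviant")
        else:
            seq.append("standard")
    return seq

seq = oddball_sequence(200)
print(seq.count("deviant"))  # roughly 20% of trials
```

In the active condition of the study, subjects would count the `"deviant"` entries; in the passive condition they would merely listen.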

Relevance: 100.00%

Abstract:

When facing a crisis, leaders' sensemaking can take considerable time because consensus on how to deal with the crisis must develop before vision formation and sensegiving can take place. However, research into the cognitive consensus that emerges as leaders deal with a crisis over time is lacking, limiting detailed understanding of how organizations respond to crises. The findings, based on a longitudinal analysis of cognitive maps within three management teams at a single organization, highlight considerable individual differences in cognitive content when members start to make sense of a crisis. Evidence for an emerging viable prescriptive mental model for the future was found, though less so in the management as a whole. Instead, the findings highlight increasing cognitive consensus, based on similarities in objectives and cause-effect beliefs, within well-defined management teams over time.

Relevance: 100.00%

Abstract:

We summarize the various strands of research on peripheral vision and relate them to theories of form perception. After a historical overview, we describe quantifications of the cortical magnification hypothesis, including an extension of Schwartz's cortical mapping function. The merits of this concept are considered across a wide range of psychophysical tasks, followed by a discussion of its limitations and the need for non-spatial scaling. We also review the eccentricity dependence of other low-level functions including reaction time, temporal resolution, and spatial summation, as well as perimetric methods. A central topic is then the recognition of characters in peripheral vision, both at low and high levels of contrast, and the impact of surrounding contours known as crowding. We demonstrate how Bouma's law, specifying the critical distance for the onset of crowding, can be stated in terms of the retinocortical mapping. The recognition of more complex stimuli, like textures, faces, and scenes, reveals a substantial impact of mid-level vision and cognitive factors. We further consider eccentricity-dependent limitations of learning, both at the level of perceptual learning and pattern category learning. Generic limitations of extrafoveal vision are observed for the latter in categorization tasks involving multiple stimulus classes. Finally, models of peripheral form vision are discussed. We report that peripheral vision is limited with regard to pattern categorization by a distinctly lower representational complexity and processing speed. Taken together, the limitations of cognitive processing in peripheral vision appear to be as significant as those imposed on low-level functions and by way of crowding.
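Bouma's law, mentioned above, can be stated as a one-liner: the critical flanker distance for the onset of crowding grows linearly with eccentricity. The proportionality constant of 0.5 used here is the conventional textbook value; the exact constant varies across studies and stimuli.

```python
def bouma_critical_spacing(eccentricity_deg, b=0.5):
    """Critical centre-to-target spacing (deg) below which flankers
    crowd a peripheral target, per Bouma's law: spacing = b * eccentricity."""
    return b * eccentricity_deg

# A letter at 10 deg eccentricity is crowded by flankers closer than ~5 deg.
print(bouma_critical_spacing(10.0))  # 5.0
```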


Relevance: 100.00%

Abstract:

Metallocene-catalyzed linear low density polyethylene (m-LLDPE) is a new generation of olefin copolymer. Based on the more recently developed metallocene-type catalysts, m-LLDPE can be synthesized with exactly controlled short chain branches and stereo-regular microstructure. The unique properties of these polymers have led to their application in many areas. As a result, it is important to have a good understanding of the oxidation mechanism of m-LLDPE during melt processing in order to develop more effective stabilisation systems and continue to increase the performance of the material. The primary objectives of this work were, firstly, to investigate the oxidative degradation mechanisms of m-LLDPE polymers having different comonomer (1-octene) content during melt processing, and secondly, to examine the effectiveness of some commercial antioxidants in the stabilisation of m-LLDPE melt. A Ziegler-polymerized LLDPE (z-LLDPE) based on the same comonomer was chosen and processed under the same conditions for comparison with the metallocene polymers. The LLDPE polymers were processed using an internal mixer (torque rheometer, TR) and a co-rotating twin-screw extruder (TSE). The effects of processing variables (time, temperature) on the rheological (MI, MWD, rheometry) and molecular (unsaturation type and content, carbonyl compounds, chain branching) characteristics of the processed polymers were examined. It was found that the catalyst type (metallocene or Ziegler) and the comonomer content of the polymers have a great impact on their oxidative degradation behavior (crosslinking or chain scission) during melt processing. The metallocene polymers mainly underwent chain scission at lower temperature (<220°C), but crosslinking became predominant at higher temperature for both TR- and TSE-processed polymers. Generally, the more comonomer the m-LLDPE contains, the larger the extent of chain scission that can be expected.
In contrast, crosslinking reactions were shown to be always dominant in the case of the Ziegler LLDPE. Furthermore, it is clear that the molecular weight distribution (MWD) of all the LLDPE became broader after processing, and tended generally to be broader at elevated temperatures and with more extrusion passes. It can therefore be concluded that crosslinking and chain scission are temperature dependent and occur simultaneously as competing reactions during melt processing. Vinyl is considered to be the most important unsaturated group leading to polymer crosslinking, as its concentration in all the LLDPE decreased after processing. Carbonyl compounds were produced during LLDPE melt processing, and ketones were shown to be the most important carbonyl-containing products in all processed polymers. The carbonyl concentration generally increased with temperature and extrusion passes, and the higher carbonyl content formed in processed z-LLDPE and in m-LLDPE polymers having higher comonomer content indicates their higher susceptibility to oxidative degradation. Hindered phenol and lactone antioxidants were shown to be effective in the stabilisation of m-LLDPE melt when used singly in TSE extrusion. The combination of hindered phenol and phosphite has a synergistic effect on m-LLDPE stabilisation, and the phenol-phosphite-lactone mixture imparted good stability to the polymers during extrusion, especially for m-LLDPE with higher comonomer content.

Relevance: 100.00%

Abstract:

Context: Subclinical hypothyroidism (SCH) and cognitive dysfunction are both common in the elderly and have been linked. It is important to determine whether T4 replacement therapy in SCH confers cognitive benefit. Objective: Our objective was to determine whether administration of T4 replacement to achieve biochemical euthyroidism in subjects with SCH improves cognitive function. Design and Setting: We conducted a double-blind, placebo-controlled randomized controlled trial in United Kingdom primary care. Patients: Ninety-four subjects aged 65 yr and over (57 females, 37 males) with SCH were recruited from a population of 147 identified by screening. Intervention: T4 or placebo was given at an initial dosage of one tablet per day (25 µg T4 per tablet) for 12 months. Thyroid function tests were performed at 8-weekly intervals, with dosage adjusted in one-tablet increments to achieve TSH within the reference range for subjects in the treatment arm. Fifty-two subjects received T4 (31 females, 21 males; mean age 73.5 yr, range 65–94 yr); 42 subjects received placebo (26 females, 16 males; mean age 74.2 yr, range 66–84 yr). Main Outcome Measures: The Mini-Mental State Examination, Middlesex Elderly Assessment of Mental State (covering orientation, learning, memory, numeracy, perception, attention, and language skills), and Trail-Making A and B were administered. Results: Eighty-two percent and 84% of the T4 group achieved euthyroidism at 6 and 12 months, respectively.
Cognitive function scores at baseline, 6, and 12 months were as follows: Mini-Mental State Examination, T4 group 28.26, 28.9, and 28.28 vs. placebo group 28.17, 27.82, and 28.25 [not significant (NS)]; Middlesex Elderly Assessment of Mental State, T4 group 11.72, 11.67, and 11.78 vs. placebo group 11.21, 11.47, and 11.44 (NS); Trail-Making A, T4 group 45.72, 47.65, and 44.52 vs. placebo group 50.29, 49.00, and 46.97 (NS); and Trail-Making B, T4 group 110.57, 106.61, and 96.67 vs. placebo group 131.46, 119.13, and 108.38 (NS). Linear mixed-model analysis demonstrated no significant changes in any of the measures of cognitive function over time and no between-group difference in cognitive scores at 6 and 12 months. Conclusions: This RCT provides no evidence that T4 replacement therapy to achieve biochemical euthyroidism improves cognitive function in elderly subjects with SCH.
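The dose-titration protocol described above — 8-weekly TSH checks with one-tablet adjustments toward the reference range — can be sketched as a single decision step. The reference-range bounds (0.4–4.5 mU/L) and the tablet cap are illustrative assumptions, not values from the trial.

```python
def adjust_dose(tablets, tsh, low=0.4, high=4.5, max_tablets=8):
    """One titration step for a daily dose counted in 25-µg T4 tablets:
    move the dose by one tablet toward bringing TSH into the reference range."""
    if tsh > high and tablets < max_tablets:
        return tablets + 1   # TSH still elevated: increase dose
    if tsh < low and tablets > 0:
        return tablets - 1   # TSH suppressed (over-replaced): decrease dose
    return tablets           # TSH within range: hold dose

print(adjust_dose(1, tsh=7.2))  # 2
print(adjust_dose(3, tsh=0.1))  # 2
print(adjust_dose(2, tsh=2.0))  # 2
```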

Relevance: 100.00%

Abstract:

In this paper we evaluate and compare two representative and popular distributed processing engines for large-scale big data analytics: Spark and the graph-based engine GraphLab. We design a benchmark suite including representative algorithms and datasets to compare the performance of the computing engines in terms of running time, memory and CPU usage, and network and I/O overhead. The benchmark suite is tested on both a local computer cluster and virtual machines on the cloud. By varying the number of computers and the memory, we examine the scalability of the computing engines with increasing computing resources (such as CPU and memory). We also run cross-evaluation of generic and graph-based analytic algorithms over graph processing and generic platforms to identify the potential performance degradation if only one processing engine is available. It is observed that both computing engines show good scalability with increasing computing resources. While GraphLab largely outperforms Spark for graph algorithms, its running time performance is close to Spark's for non-graph algorithms. Additionally, the running time with Spark for graph algorithms over cloud virtual machines is observed to increase by almost 100% compared to local computer clusters.
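The benchmarking methodology above — run each algorithm on each engine and record wall-clock running time — can be sketched engine-agnostically. The engine and workload names below are stand-ins (plain Python callables), not actual Spark or GraphLab submission APIs.

```python
import time

def _timed(run, workload):
    """Wall-clock seconds for one run of a workload on an engine."""
    start = time.perf_counter()
    run(workload)
    return time.perf_counter() - start

def benchmark(engines, workloads, repeats=3):
    """Time each (engine, workload) pair; keep the best of `repeats`
    runs, mirroring a running-time benchmark matrix."""
    return {
        (engine_name, wl_name): min(_timed(run, wl) for _ in range(repeats))
        for engine_name, run in engines.items()
        for wl_name, wl in workloads.items()
    }

# Stand-ins for engine-submitted jobs (assumed names, not real APIs):
engines = {"engineA": lambda wl: wl(), "engineB": lambda wl: wl()}
workloads = {"pagerank-ish": lambda: sum(i * i for i in range(10_000))}
table = benchmark(engines, workloads)
print(sorted(table))  # [('engineA', 'pagerank-ish'), ('engineB', 'pagerank-ish')]
```

In a real suite, each engine entry would wrap job submission to the cluster, and the harness would additionally sample memory, CPU, and network counters.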

Relevance: 100.00%

Abstract:

Recent functional magnetic resonance imaging (fMRI) investigations of the interaction between cognition and reward processing have found that the lateral prefrontal cortex (PFC) areas are preferentially activated to both increasing cognitive demand and reward level. Conversely, ventromedial PFC (VMPFC) areas show decreased activation to the same conditions, indicating a possible reciprocal relationship between cognitive and emotional processing regions. We report an fMRI study of a rewarded working memory task, in which we further explore how the relationship between reward and cognitive processing is mediated. We not only assess the integrity of reciprocal neural connections between the lateral PFC and VMPFC brain regions in different experimental contexts but also test whether additional cortical and subcortical regions influence this relationship. Psychophysiological interaction analyses were used as a measure of functional connectivity in order to characterize the influence of both cognitive and motivational variables on connectivity between the lateral PFC and the VMPFC. Psychophysiological interactions revealed negative functional connectivity between the lateral PFC and the VMPFC in the context of high memory load, and high memory load in tandem with a highly motivating context, but not in the context of reward alone. Physiophysiological interactions further indicated that the dorsal anterior cingulate and the caudate nucleus modulate this pathway. These findings provide evidence for a dynamic interplay between lateral PFC and VMPFC regions and are consistent with an emotional gating role for the VMPFC during cognitively demanding tasks. Our findings also support neuropsychological theories of mood disorders, which have long emphasized a dysfunctional relationship between emotion/motivational and cognitive processes in depression.
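At its core, a psychophysiological interaction (PPI) regressor of the kind used above is the elementwise product of a psychological (task) vector and a physiological (seed-region) timeseries; negative coupling shows up as a negative correlation between that regressor and the target region. The sketch below uses entirely synthetic signals and a hypothetical coupling strength.

```python
import math
import random

def ppi_regressor(task, seed_ts):
    """PPI interaction term: elementwise product of the mean-centred
    task vector and the seed-region timeseries."""
    m = sum(task) / len(task)
    return [(t - m) * s for t, s in zip(task, seed_ts)]

def pearson_r(x, y):
    """Plain Pearson correlation coefficient."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

rng = random.Random(1)
task = [1.0 if i % 20 < 10 else 0.0 for i in range(200)]  # high/low-load blocks
seed_ts = [rng.gauss(0.0, 1.0) for _ in range(200)]       # lateral-PFC-like seed
ppi = ppi_regressor(task, seed_ts)
# A target region that couples negatively to the seed only under high load:
target = [-0.8 * p + rng.gauss(0.0, 0.3) for p in ppi]
print(pearson_r(ppi, target) < 0)  # negative "functional connectivity"
```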

Relevance: 100.00%

Abstract:

Common problems encountered in clinical sensing are non-biocompatibility and slow response time of the device. The latter, which also applies to chemical sensors, is possibly due to a lack of understanding of polymer support or membrane properties, and hence failure to optimise membranes chosen for specific sensor applications. Hydrogels can be described as polymers which swell in water. In addition, the presence of water in the polymer matrix offers some control of biocompatibility. They thus provide a medium through which rapid transport of a sensed species to an incorporated reagent could occur. This work considers the feasibility of such a system, leading to the design and construction of an optical sensor test bed. The development of suitable membrane systems and of suitable techniques for coating them onto the fibre optics is described. Initial results obtained from hydrogel coatings implied that the refractive index change in the polymer matrix, due to a change in water content with pH, was the major factor contributing to the sensor response. However, the presence of the colourimetric reagent was also altering the output signal obtained. An analysis of factors contributing to the overall response, such as colour change and membrane composition, was made both on the test bed, via optical response, and on whole membranes, via measurement of water content change. Coatings with low equilibrium water contents, of less than 10%, were investigated, and in fact a clearer signal response from the test bed was noted. Again, these membranes were surprisingly responding via refractive index change, with the reagent playing a primary role in obtaining a sensible, non-random response, although not in a colourimetric fashion. A photographic study of these coatings revealed some clues as to their physical nature and hence partially explained this phenomenon.
A study of the transport properties of the most successful membrane, on a coated wire electrode and also on the fibre optic test bed, in a series of test environments, indicated that the reagent was possibly acting as an ion exchanger and hence had a major influence on transport and therefore on sensor characteristics.

Relevance: 100.00%

Abstract:

Existing theories of semantic cognition propose models of cognitive processing occurring in a conceptual space, where ‘meaning’ is derived from the spatial relationships between concepts’ mapped locations within the space. Information visualisation is a growing area of research within the field of information retrieval, and methods for presenting database contents visually in the form of spatial data management systems (SDMSs) are being developed. This thesis combined these two areas of research to investigate the benefits of employing spatial-semantic mapping (documents represented as objects in two- and three-dimensional virtual environments are proximally mapped dependent on the semantic similarity of their content) as a tool for improving retrieval performance and navigational efficiency when browsing for information within such systems. Positive effects associated with the quality of document mapping were observed; improved retrieval performance and browsing behaviour were witnessed when mapping was optimal. It was also shown that using a third dimension for virtual environment (VE) presentation provides sufficient additional information regarding the semantic structure of the environment that performance is increased in comparison to two-dimensional mapping. A model describing the relationship between retrieval performance and browsing behaviour was proposed on the basis of these findings. Individual differences were not found to have any observable influence on retrieval performance or browsing behaviour when mapping quality was good. The findings from this work have implications both for cognitive modelling of semantic information and for designing and testing information visualisation systems. These implications are discussed in the conclusions of this work.
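The spatial-semantic mapping principle above — documents placed near each other in proportion to the semantic similarity of their content — can be illustrated with a toy cosine-similarity check over term-count vectors. The mini "documents" are invented, and real SDMSs would use far richer representations than raw term counts.

```python
import math
from collections import Counter

def cosine_similarity(doc_a, doc_b):
    """Cosine similarity between two documents' term-count vectors;
    higher similarity would mean closer placement in the VE."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)          # Counter returns 0 for absent terms
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

d1 = "retrieval of semantic information from spatial maps"
d2 = "spatial maps support semantic information retrieval"
d3 = "polymer melt processing and oxidative degradation"
# d1 should be mapped nearer to d2 than to d3:
print(cosine_similarity(d1, d2) > cosine_similarity(d1, d3))  # True
```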

Relevance: 100.00%

Abstract:

This thesis addresses data assimilation, which typically refers to the estimation of the state of a physical system given a model and observations, and its application to short-term precipitation forecasting. A general introduction to data assimilation is given, from both a deterministic and a stochastic point of view. Data assimilation algorithms are reviewed, first in the static case (when no dynamics are involved), then in the dynamic case. A double experiment on two non-linear models, the Lorenz 63 and the Lorenz 96 models, is run, and the comparative performance of the methods is discussed in terms of quality of the assimilation, robustness in the non-linear regime, and computational time. Following the general review and analysis, data assimilation is discussed in the particular context of very short-term rainfall forecasting (nowcasting) using radar images. An extended Bayesian precipitation nowcasting model is introduced. The model is stochastic in nature and relies on the spatial decomposition of the rainfall field into rain "cells". Radar observations are assimilated using a Variational Bayesian method in which the true posterior distribution of the parameters is approximated by a more tractable distribution. The motion of the cells is captured by a 2D Gaussian process. The model is tested on two precipitation events, the first dominated by convective showers, the second by precipitation fronts. Several deterministic and probabilistic validation methods are applied, and the model is shown to retain reasonable prediction skill at up to 3 hours lead time. Extensions to the model are discussed.
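The Lorenz 63 system used as a test bed above is small enough to sketch directly. Here a classical fourth-order Runge-Kutta step integrates it with the standard chaotic parameters (sigma = 10, rho = 28, beta = 8/3); any assimilation experiment would wrap such an integrator as its forecast model.

```python
def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Time derivative (dx/dt, dy/dt, dz/dt) of the Lorenz 63 system."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(f, state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(
        s + dt / 6.0 * (a + 2 * b + 2 * c + d)
        for s, a, b, c, d in zip(state, k1, k2, k3, k4)
    )

state = (1.0, 1.0, 1.0)
for _ in range(1000):          # integrate 10 time units at dt = 0.01
    state = rk4_step(lorenz63, state, 0.01)
print(all(abs(v) < 60 for v in state))  # trajectory stays on the bounded attractor
```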

Relevance: 100.00%

Abstract:

AIM: To determine the validity and reliability of corneal curvature and non-invasive tear break-up time (NITBUT) measures using the Oculus Keratograph. METHOD: One hundred eyes of 100 patients had their corneal curvature assessed with the Keratograph and the Nidek ARKT TonorefII. NITBUT was then measured objectively with the Keratograph with Tear Film Scan software and subjectively with the Keeler Tearscope. The Keratograph measurements of corneal curvature and NITBUT were repeated to test reliability. The ocular surface disease index questionnaire was completed to quantify ocular comfort. RESULTS: The Keratograph consistently measured significantly flatter corneal curvatures than the ARKT (MSE difference: +1.83 ± 0.44 D), but was repeatable (p > 0.05). Keratograph NITBUT measurements were significantly lower than observation with the Tearscope (by 12.35 ± 7.45 s; p < 0.001) and decreased on subsequent measurement (by 1.64 ± 6.03 s; p < 0.01). The Keratograph measures the first time the tears break up anywhere on the cornea, with 63% of subjects having NITBUTs < 5 s and a further 22% having readings between 5 and 10 s. The Tearscope results correlated better with patients' symptoms (r = -0.32) than the Keratograph (r = -0.19). CONCLUSIONS: The Keratograph requires a calibration off-set to be comparable to other keratometry devices. Its current software detects very early tear film changes, recording significantly lower NITBUT values than conventional subjective assessment. Adjustments to the instrument software have the potential to enhance the value of Keratograph objective measures in clinical practice.
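The calibration off-set recommended in the conclusion is, in effect, the mean paired difference between the two devices' readings. A minimal sketch on invented paired keratometry data (the readings below are hypothetical, chosen to give an offset near the reported +1.83 D bias):

```python
def mean_paired_difference(device_a, device_b):
    """Mean of (A - B) over paired readings: the additive calibration
    offset to subtract from device A to match device B on average."""
    diffs = [a - b for a, b in zip(device_a, device_b)]
    return sum(diffs) / len(diffs)

# Invented paired mean spherical equivalent readings (D):
ark = [43.95, 45.10, 43.65, 45.85]
keratograph = [42.10, 43.25, 41.80, 44.00]
offset = mean_paired_difference(ark, keratograph)
print(round(offset, 2))  # 1.85
```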

Relevance: 100.00%

Abstract:

Approximately half of current contact lens wearers suffer from dryness and discomfort, particularly towards the end of the day. Contact lens practitioners have a number of dry eye tests available to help them predict which of their patients may be at risk of contact lens drop-out and to advise them accordingly. This thesis set out to rationalize these tests, to see whether any are of more diagnostic significance than others. This doctorate found: (1) The Keratograph, a device which permits an automated, examiner-independent technique for measuring non-invasive tear break-up time (NITBUT), consistently measured NITBUT shorter than the Tearscope. When measuring central corneal curvature, it measured the spherical equivalent power of the cornea as significantly flatter than a validated automated keratometer did. (2) Non-invasive and invasive tear break-up times correlated significantly with each other, but not with the other tear metrics. Symptomology, assessed using the OSDI questionnaire, correlated more with tests indicating possible damage to the ocular surface (including LWE, LIPCOF and conjunctival staining) than with tests of either tear volume or stability. Cluster analysis showed some statistically significant groups of patients with different sign and symptom profiles. The largest cluster demonstrated poor tear quality with both non-invasive and invasive tests, low tear volume and more symptoms. (3) Care should be taken in fitting patients new to contact lenses if they have a NITBUT of less than 10 s or an OSDI comfort rating greater than 4.2, as they are more likely to drop out within the first 6 months. Cluster analysis was not found to be beneficial in predicting which patients will succeed with lenses and which will not. A combination of the OSDI questionnaire and a NITBUT measurement was most useful both in diagnosing dry eye and in predicting contact lens drop-out.
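The cut-offs in finding (3) — NITBUT below 10 s or an OSDI comfort rating above 4.2 — amount to a simple screening rule for prospective new wearers. A sketch, treating the thresholds exactly as stated (the function name and strict inequalities are assumptions of this sketch):

```python
def dropout_risk(nitbut_s, osdi_comfort):
    """Flag prospective contact lens wearers at elevated risk of
    dropping out within 6 months, per the thesis cut-offs:
    NITBUT < 10 s or OSDI comfort rating > 4.2."""
    return nitbut_s < 10.0 or osdi_comfort > 4.2

print(dropout_risk(nitbut_s=7.5, osdi_comfort=2.0))   # True (short NITBUT)
print(dropout_risk(nitbut_s=14.0, osdi_comfort=5.0))  # True (poor comfort rating)
print(dropout_risk(nitbut_s=14.0, osdi_comfort=3.0))  # False
```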

Relevance: 100.00%

Abstract:

Purpose: To quantify end-of-day silicone-hydrogel daily disposable contact lens fit and its influence on ocular comfort, physiology and lens wettability. Methods: Thirty-nine subjects (22.1 ± 3.5 years) were randomised to wear each of 3 silicone-hydrogel daily-disposable contact lenses (narafilcon A, delefilcon A and filcon II 3), bilaterally, for one week. Lens fit was assessed objectively using a digital video slit-lamp at 8, 12 and 16 h after lens insertion. Hyperaemia, non-invasive tear break-up time, tear meniscus height and comfort were also evaluated at these timepoints, while corneal and conjunctival staining were assessed on lens removal. Results: Lens fit assessments were not different between brands (P > 0.05), with the exception of movement on blink, where narafilcon A was more mobile. Overall, lag reduced but push-up speed increased from 8 to 12 h (P < 0.05), then remained stable from 12 to 16 h (P > 0.05). Movement on blink was unaffected by wear time (F = 0.403, P = 0.670). A more mobile lens fit with one brand did not indicate that a person would have a more mobile fit with another brand (r = -0.06 to 0.63). Lens fit was not correlated with comfort, ocular physiology or lens wettability (P > 0.01). Conclusions: Among the lenses tested, objective lens fit changed between 8 and 12 h of lens wear. The weak correlation in individual lens fit between brands indicates that fit depends on more than ocular shape. Consequently, substitution of a different lens brand with similar parameters will not necessarily provide comparable lens fit.