893 results for non-conscious cognitive processing (NCCP) time.


Relevance:

40.00%

Publisher:

Abstract:

A variety of iron compounds containing vinyl or thiol functional groups (used as photoactivators) have been synthesised, and some of these were successfully bound to both polyethylene and polypropylene backbones during processing in the presence of peroxide and an interlinking agent. Concentrates (masterbatches) of the photoactivators in PP and PE were prepared, and the pro-oxidant effect of the diluted masterbatches in the absence and presence of an antioxidant was evaluated. An antioxidant photoactivator (FeDNC) was found to sensitise the photoactivity of pro-oxidants (Metone A / Metone M), whereas an antioxidant (ZnDNC) was found to stabilise the polymer (PP and PE) containing both of these combinations. It was observed that a lower concentration of FeDNC sensitises the polymer containing a very small concentration of NiDNC, whereas a higher concentration of FeDNC stabilises the polymer (LDPE) containing the same amount of NiDNC, compared with FeDNC alone. The photostability of unstabilised PP containing FeAc could be varied by varying the concentration of ZnDEC. Both the induction period and the UV lifetime of the polymer increased with increasing concentration of ZnDEC. It is suggested that a ligand exchange reaction may take place between FeAc and ZnDNC. A polymer-bound UV stabiliser (HAEB) and a thermal stabiliser (DBBA) were used with a non-extractable photoactivator (FeAc) in PP. Small concentrations of the stabilisers (HAEB and DBBA) in combination with the photoactivator (FeAc) sensitise the polymer. The antioxidant present in commercial polymer (LDPE and PP) was found to be of a hindered phenol type, which was found to antagonise ZnDNC when used in combination with the photoactivators.

Relevance:

40.00%

Publisher:

Abstract:

This study examined the extent to which students could fake responses on personality and approaches-to-studying questionnaires, and the effects of such responding on the validity of non-cognitive measures for predicting academic performance (AP). University students produced a profile of an ‘ideal’ student using the Big-Five personality taxonomy, which yielded a stereotype with low scores for Neuroticism and high scores for the other four traits. A subset of participants was allocated to a condition in which they were instructed to fake their responses as university applicants, portraying themselves as positively as possible. These participants scored higher than those in a control condition on measures of deep and strategic approaches to studying, but lower on the surface approach variable. Conscientiousness was a significant predictor of AP in both groups, but the predictive effect of the approaches-to-studying variables and Openness to Experience identified in the control group was lower in the group that faked their responses. Non-cognitive psychometric measures can be valid predictors of AP, but scores on these measures can be affected by instructional set. Further implications for psychometric measurement in educational settings are discussed.

Relevance:

40.00%

Publisher:

Abstract:

Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real-time data such as temperature, rainfall and wind speed can be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining usable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information tailored to the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even when that location is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
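As a rough illustration of the kind of interpolation query this abstract refers to, the sketch below estimates the temperature at an arbitrary, uninstrumented location from nearby station readings using inverse distance weighting, and reports a crude uncertainty figure alongside the estimate. The station coordinates, readings and the choice of inverse distance weighting are illustrative assumptions; the paper does not specify its interpolation method in this abstract.

```python
import numpy as np

def idw_interpolate(stations, values, query, power=2.0):
    """Inverse-distance-weighted estimate at `query`, plus a rough
    uncertainty taken as the weighted spread of the station values."""
    stations = np.asarray(stations, dtype=float)   # (n, 2) lon/lat pairs
    values = np.asarray(values, dtype=float)       # (n,) station readings
    dist = np.linalg.norm(stations - np.asarray(query, dtype=float), axis=1)
    if np.any(dist < 1e-9):                        # query sits on a station
        i = int(np.argmin(dist))
        return values[i], 0.0
    w = 1.0 / dist ** power
    w /= w.sum()
    estimate = float(np.dot(w, values))
    spread = float(np.sqrt(np.dot(w, (values - estimate) ** 2)))
    return estimate, spread

# Hypothetical station readings (deg C) surrounding a query location.
stations = [(-1.90, 52.48), (-1.75, 52.41), (-2.00, 52.55), (-1.80, 52.60)]
temps = [14.2, 13.8, 12.9, 13.5]
estimate, spread = idw_interpolate(stations, temps, query=(-1.85, 52.50))
print(f"estimated temperature: {estimate:.1f} +/- {spread:.1f} deg C")
```

In a system like the one described, the spread value would be the kind of quantity encoded and passed to the user in an uncertainty markup such as UncertML.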

Relevance:

40.00%

Publisher:

Abstract:

The possibility that developmental dyslexia results from low-level sensory processing deficits has received renewed interest in recent years. Opponents of such sensory-based explanations argue that dyslexia arises primarily from phonological impairments. However, many behavioural correlates of dyslexia cannot be explained sufficiently by cognitive-level accounts, and there is anatomical, psychometric and physiological evidence of sensory deficits in the dyslexic population. This thesis aims to determine whether the low-level (pre-attentive) processing of simple auditory stimuli is disrupted in compensated adult dyslexics. Using psychometric and neurophysiological measures, the nature of auditory processing abnormalities is investigated. Group comparisons are supported by analysis of individual data in order to address the issue of heterogeneity in dyslexia. The participant pool consisted of seven compensated dyslexic adults and seven age- and IQ-matched controls. The dyslexic group were impaired, relative to the control group, on measures of literacy, phonological awareness, working memory and processing speed. Magnetoencephalographic recordings were conducted during the processing of simple, non-speech auditory stimuli. The results confirm that low-level auditory processing deficits are present in compensated dyslexic adults. The amplitudes of N1m responses to tone-pair stimuli were reduced in the dyslexic group. However, there was no evidence that manipulating either the silent interval or the frequency separation between the tones had a greater detrimental effect on dyslexic participants specifically. Abnormal MMNm responses were recorded in response to frequency-deviant stimuli in the dyslexic group. In addition, complete stimulus omissions, which evoked MMNm responses in all control participants, failed to elicit significant MMNm responses in all but one of the dyslexic individuals. The data indicate both a deficit of frequency resolution at a local level of auditory processing and a higher-level deficit relating to the grouping of auditory stimuli, relevant for auditory scene analysis. Implications and directions for future research are outlined.

Relevance:

40.00%

Publisher:

Abstract:

Huge advertising budgets are invested by firms to reach and convince potential consumers to buy their products. To optimize these investments, it is fundamental not only to ensure that the appropriate consumers are reached, but also that they are in appropriate reception conditions. Marketing research has focused on the way consumers react to advertising, as well as on individual and contextual factors that could mediate or moderate the impact of an ad on consumers (e.g. motivation and ability to process information, or attitudes toward advertising). Nevertheless, one factor that potentially influences consumers’ reactions to advertising has not yet been studied in marketing research: fatigue. Yet fatigue can affect key variables of advertising processing, such as the availability of cognitive resources (Lieury 2004). Fatigue is felt when the body signals the need to stop an activity (or inactivity) and rest, allowing the individual to compensate for its effects. Dittner et al. (2004) define it as “the state of weariness following a period of exertion, mental or physical, characterized by a decreased capacity for work and reduced efficiency to respond to stimuli.” It signals that resources will run short if the ongoing activity continues. According to Schmidtke (1969), fatigue impairs information reception, perception, coordination, attention, concentration and thinking. In addition, for Markle (1984), fatigue reduces memory and communication ability, while increasing reaction time and the number of errors. Thus, fatigue may have large effects on advertising processing. We suggest that fatigue determines the level of available resources.

Some research on consumer responses to advertising claims that complexity is a fundamental element to take into account. Complexity determines the cognitive effort the consumer must provide to understand the message (Putrevu et al. 2004). Thus, we suggest that complexity determines the level of required resources. To study this question of the need for and provision of cognitive resources, we draw upon Resource Matching Theory. Anand and Sternthal (1989, 1990) were the first to state the resource matching principle: an ad is most persuasive when the resources required to process it match the resources the viewer is willing and able to provide. They show that when the required resources exceed those available, the message is not fully processed by the consumer, and when available resources exceed those required, the viewer elaborates critical or unrelated thoughts. According to Resource Matching Theory, the level of resources demanded by an ad can be high or low and is mostly determined by the ad’s layout (Peracchio and Meyers-Levy, 1997). We manipulate the level of required resources using three levels of ad complexity (low, high, extremely high). On the other hand, the resource availability of an ad viewer is determined by many contextual and individual variables; we manipulate the level of available resources using two levels of fatigue (low, high). Tired viewers want to limit processing effort to minimal resource requirements by relying on heuristics, forming an overall impression at first glance; it will be easier for them to decode the message when ads are very simple. By contrast, the most effective ads for viewers who are not tired are complex enough to draw their attention and fully use their resources. These viewers will use more analytical strategies, examining the details of the ad. However, if ads are too complex, they will be too difficult to understand; the viewer will be discouraged from processing the information and will overlook the ad.

The objective of our research is to study fatigue as a moderating variable of advertising information processing. We run two experimental studies to assess the effect of fatigue on visual strategies, comprehension, persuasion and memorization. In study 1, thirty-five undergraduate students enrolled in a marketing research course participated in the experiment. The experimental design is 2 (tiredness level: between subjects) × 3 (ad complexity level: within subjects). Participants were randomly assigned a schedule time (morning: 8-10 am or evening: 10-12 pm) to perform the experiment; we chose to test subjects at different times of day to obtain maximum variance in their fatigue level, and we use the morningness/eveningness tendency of participants (Horne & Ostberg, 1976) as a control variable. We assess fatigue level using subjective measures (a questionnaire with fatigue scales) and objective measures (reaction time and number of errors). Regarding complexity levels, we designed our own ads in order to keep aspects other than complexity equal. We ran a pretest using the Resource Demands scale (Keller and Bloch 1997) and by rating the ads on complexity, following Morrison and Dainoff (1972), to check our complexity manipulation; we found three significantly different levels. After completing the fatigue scales, participants are asked to view the ads on a screen while their eye movements are recorded by an eye-tracker. Eye tracking allows us to identify patterns of visual attention (Pieters and Warlop 1999), and we are then able to infer respondents’ visual strategies according to their level of fatigue. Comprehension is assessed with a comprehension test; we collect measures of attitude change for persuasion, and measures of recall and recognition at various points in time for memorization. Once the effect of fatigue has been determined across the student population, it is of interest to account for individual differences in fatigue severity and perception. We therefore run study 2, which is similar to the first except for the design: time of day is now within subjects and complexity becomes between subjects.
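For readers unfamiliar with this kind of design, the sketch below shows one conventional way a 2 (fatigue: between subjects) × 3 (ad complexity: within subjects) mixed design could be analysed, using a mixed-design ANOVA from the pingouin package. The data are simulated and the variable names are hypothetical; none of the numbers come from the studies described above.

```python
import numpy as np
import pandas as pd
import pingouin as pg

# Simulated data for a 2 (fatigue: between) x 3 (complexity: within) design,
# analogous in structure to study 1; values are invented for illustration.
rng = np.random.default_rng(0)
rows = []
for subj in range(35):
    fatigue = "high" if subj % 2 else "low"
    for complexity in ("low", "high", "extreme"):
        # e.g. a comprehension score; tired viewers do worse on complex ads
        score = rng.normal(12 - 2 * (fatigue == "high") * (complexity != "low"), 2)
        rows.append({"subject": subj, "fatigue": fatigue,
                     "complexity": complexity, "score": score})
df = pd.DataFrame(rows)

# Mixed-design ANOVA: the fatigue x complexity interaction is the moderation
# effect the authors are interested in.
aov = pg.mixed_anova(data=df, dv="score", within="complexity",
                     subject="subject", between="fatigue")
print(aov.round(3))
```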

Relevance:

40.00%

Publisher:

Abstract:

We have proposed a new technique of all-optical nonlinear pulse processing for use at a return-to-zero (RZ) optical receiver, based on an amplitude modulator (AM), or any device with a similar temporal gating/slicing function, enhanced by the effect of Kerr nonlinearity in a normal dispersion fiber (NDF). The efficiency of the technique has been demonstrated by application to timing jitter- and noise-limited RZ transmission at 40 Gbit/s. Substantial suppression of the signal timing jitter and an overall improvement of the receiver performance have been demonstrated using the proposed method.
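The pulse reshaping in a normal dispersion fiber that this technique relies on can be illustrated numerically with the standard split-step Fourier method for the nonlinear Schrödinger equation. The sketch below is a minimal simulation with illustrative fiber and pulse parameters (not taken from the paper): a short Gaussian pulse propagated through an NDF broadens and flattens under the combined action of normal dispersion and Kerr nonlinearity, before any temporal gating is applied.

```python
import numpy as np

def fwhm(t, power):
    """Full width at half maximum of a sampled pulse power profile."""
    above = t[power >= power.max() / 2]
    return above[-1] - above[0]

# Illustrative parameters only (not the paper's).
beta2 = 5.0e-27        # s^2/m, normal (positive) group velocity dispersion
gamma = 2.0e-3         # 1/(W m), Kerr nonlinearity coefficient
length = 2.0e3         # m of normal dispersion fiber
steps = 2000
dz = length / steps

# Time grid around one 25 ps bit slot; 0.5 W peak, ~5 ps FWHM Gaussian pulse.
n = 2 ** 12
t = np.linspace(-25e-12, 25e-12, n, endpoint=False)
dt = t[1] - t[0]
field = np.sqrt(0.5) * np.exp(-t ** 2 / (2 * (3e-12) ** 2))
p_in = np.abs(field) ** 2

# Split-step Fourier integration of the scalar NLSE (loss neglected).
omega = 2 * np.pi * np.fft.fftfreq(n, d=dt)
half_disp = np.exp(0.5j * beta2 * omega ** 2 * (dz / 2))
for _ in range(steps):
    field = np.fft.ifft(half_disp * np.fft.fft(field))      # half dispersion step
    field *= np.exp(1j * gamma * np.abs(field) ** 2 * dz)   # full nonlinear step
    field = np.fft.ifft(half_disp * np.fft.fft(field))      # half dispersion step

print("input FWHM:  %.1f ps" % (fwhm(t, p_in) * 1e12))
print("output FWHM: %.1f ps" % (fwhm(t, np.abs(field) ** 2) * 1e12))
```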

Relevance:

40.00%

Publisher:

Abstract:

Recent advances in our ability to watch the molecular and cellular processes of life in action, such as atomic force microscopy, optical tweezers and Förster fluorescence resonance energy transfer, raise challenges for digital signal processing (DSP) of the resulting experimental data. This article explores the unique properties of such biophysical time series that set them apart from other signals, such as the prevalence of abrupt jumps and steps, multi-modal distributions and autocorrelated noise. It exposes the problems with classical linear DSP algorithms applied to this kind of data, and describes new nonlinear and non-Gaussian algorithms that are able to extract information of direct relevance to biological physicists. It is argued that these new methods applied in this context typify the nascent field of biophysical DSP. Practical experimental examples are supplied.
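As a toy illustration of why classical linear DSP struggles with such signals, the sketch below compares a linear moving-average filter with a nonlinear median filter on a simulated noisy, step-like trace of the kind produced by single-molecule experiments. The data, filter widths and choice of filters are illustrative assumptions, not methods taken from the article.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter1d

# Simulated molecular-motor-like trace: abrupt steps plus autocorrelated noise.
rng = np.random.default_rng(1)
steps = np.repeat([0.0, 8.0, 8.0, 16.0, 24.0], 200)   # piecewise-constant signal
noise = uniform_filter1d(rng.normal(0, 2.0, steps.size), size=5)  # crude autocorrelation
trace = steps + noise

linear = uniform_filter1d(trace, size=51)   # linear moving average: smears the jumps
nonlinear = median_filter(trace, size=51)   # median filter: preserves sharp steps

# Compare how well each recovers the true piecewise-constant signal.
print("moving-average RMSE:", np.sqrt(np.mean((linear - steps) ** 2)).round(3))
print("median-filter RMSE: ", np.sqrt(np.mean((nonlinear - steps) ** 2)).round(3))
```

The linear filter necessarily trades noise suppression against blurring of the jumps, whereas the nonlinear filter keeps the step edges sharp; this is the basic motivation for the jump-preserving, non-Gaussian methods the article describes.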

Relevance:

40.00%

Publisher:

Abstract:

We propose a new all-optical signal processing technique to enhance the performance of a return-to-zero optical receiver, which is based on nonlinear temporal pulse broadening and flattening in a normal dispersion fiber and subsequent slicing of the pulse temporal waveform. The potential of the method is demonstrated by application to timing jitter- and noise-limited transmission at 40 Gbit/s. © 2005 Optical Society of America.
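The slicing step that this and the preceding abstract describe can be illustrated with a minimal sketch: a broadened, flattened pulse is gated by a short temporal window centred on the bit slot, so that small timing jitter shifts the pulse across an almost flat region of the waveform and the gated energy varies little. The pulse shapes, window width and jitter range below are assumptions chosen for illustration, not the paper's values.

```python
import numpy as np

# Time axis spanning one 25 ps bit slot (40 Gbit/s).
t = np.linspace(-12.5e-12, 12.5e-12, 2001)
dt = t[1] - t[0]

def gated_energy(pulse_width, jitter, gate_width=5e-12, order=1):
    """Energy passed by a rectangular temporal gate for a (super-)Gaussian
    pulse arriving with a timing offset `jitter`."""
    pulse = np.exp(-((t - jitter) / pulse_width) ** (2 * order))
    gate = np.abs(t) <= gate_width / 2
    return np.sum(pulse[gate]) * dt

jitters = np.linspace(-2e-12, 2e-12, 41)
# Narrow Gaussian pulse straight off the line vs. a broadened, flattened
# (4th-order super-Gaussian) pulse after the normal dispersion fiber.
narrow = [gated_energy(3e-12, j, order=1) for j in jitters]
flat = [gated_energy(9e-12, j, order=4) for j in jitters]

print("relative gated-energy spread, unprocessed pulse:", (np.ptp(narrow) / np.mean(narrow)).round(3))
print("relative gated-energy spread, flattened pulse:  ", (np.ptp(flat) / np.mean(flat)).round(3))
```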

Relevance:

40.00%

Publisher:

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

40.00%

Publisher:

Abstract:

Sensory processing is a crucial underpinning of the development of social cognition, a function which is compromised to a variable degree in patients with pervasive developmental disorders (PDD). In this manuscript, we review some of the most recent and relevant contributions that have examined auditory sensory processing derangement in PDD. The variability in the clinical characteristics of the samples studied so far, in terms of the severity of the associated cognitive deficits and the associated limited compliance, underlying aetiology and demographic features, makes a univocal interpretation arduous. We hypothesise that, in patients with severe mental deficits, the presence of impaired auditory sensory memory as expressed by the mismatch negativity could be a non-specific indicator of more diffuse cortical deficits rather than being causally related to the clinical symptomatology. More consistent findings seem to emerge from studies of less severely impaired patients, in whom increased pitch perception has been interpreted as an indicator of increased local processing, probably as a compensatory mechanism for the lack of global processing (central coherence). This latter hypothesis seems extremely attractive, and future trials in larger cohorts of patients, possibly standardising the characteristics of the stimuli, are a much-needed development. Finally, the specificity of the role of auditory derangement, as opposed to other sensory channels, needs to be assessed more systematically using multimodal stimuli in the same patient group. © 2006 Elsevier B.V. All rights reserved.

Relevance:

40.00%

Publisher:

Abstract:

In this paper, we propose to increase residual carrier frequency offset tolerance based on short perfect reconstruction pulse shaping for coherent optical orthogonal frequency-division multiplexing (CO-OFDM). The proposed method suppresses the penalty induced by residual carrier frequency offset at the receiver, without requiring any additional overhead or exhaustive signal processing. The Q-factor improvement contributed by the proposed method is 1.6 dB and 1.8 dB for the time-frequency localization maximization and out-of-band energy minimization pulse shapes, respectively. Finally, the transmission span gain under the influence of residual carrier frequency offset is ~62% with the out-of-band energy minimization pulse shape. © 2014 Optical Society of America.
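For orientation, the sketch below shows a generic windowed-OFDM transmitter in which each cyclically extended symbol is tapered and overlap-added with its neighbours. The raised-cosine taper used here is only a stand-in for the paper's perfect-reconstruction pulse shapes (time-frequency localization maximization and out-of-band energy minimization), which are not reproduced, and all parameters are illustrative assumptions.

```python
import numpy as np

# Illustrative parameters, not taken from the paper.
n_sc, cp, roll, n_sym = 128, 16, 8, 4   # subcarriers, cyclic prefix, taper, symbols

rng = np.random.default_rng(2)
bits = rng.integers(0, 4, size=(n_sym, n_sc))
qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))   # QPSK subcarrier data

def shape_symbol(freq_data):
    """IFFT + cyclic extension + raised-cosine taper at both symbol edges.
    The taper is a placeholder for the paper's pulse shapes."""
    time = np.fft.ifft(freq_data)
    ext = np.concatenate([time[-(cp + roll):], time, time[:roll]])
    win = np.ones(ext.size)
    ramp = 0.5 * (1 - np.cos(np.pi * np.arange(roll) / roll))
    win[:roll] = ramp
    win[-roll:] = ramp[::-1]
    return ext * win

# Overlap-add consecutive shaped symbols: adjacent tapers overlap by `roll` samples,
# which is what confines each symbol's spectrum and reduces sensitivity to
# residual carrier frequency offset.
sym_len = n_sc + cp + roll
out = np.zeros(n_sym * sym_len + roll, dtype=complex)
for k in range(n_sym):
    out[k * sym_len : k * sym_len + n_sc + cp + 2 * roll] += shape_symbol(qpsk[k])

print("waveform samples:", out.size)
```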

Relevance:

40.00%

Publisher:

Abstract:

The human mirror neuron system (MNS) has recently been a major topic of research in cognitive neuroscience. As a very basic reflection of the MNS, human observers are faster at imitating a biological movement than a non-biological one. However, it is unclear which cortical areas and their interactions (synchronization) are responsible for this behavioural advantage. We investigated the time course of long-range synchronization within cortical networks during an imitation task in 10 healthy participants by means of whole-head magnetoencephalography (MEG). Extending previous work, we conclude that left ventrolateral premotor, bilateral temporal and parietal areas mediate the observed behavioural advantage of biological movements in close interaction with the basal ganglia and other motor areas (cerebellum, sensorimotor cortex). Besides the left ventrolateral premotor cortex, we identified the right temporal pole and the posterior parietal cortex as important junctions for the integration of information from different sources in imitation tasks that are controlled for movement type (biological vs. non-biological) and that involve a certain amount of spatial orienting of attention. Finally, we also found the basal ganglia to participate at an early stage in the processing of biological movement, possibly by selecting suitable motor programs that match the stimulus.

Relevance:

40.00%

Publisher:

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica.

The use of these phantoms to characterise many properties of an OCT system (resolution, distortion, sensitivity decay, scan linearity) was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion.

Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). However, graphics processing unit (GPU) based data processing methods have recently been developed to minimise processing and rendering time. These techniques include standard processing methods, a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate.

OCT-phantoms have been used heavily for the qualitative characterisation and adjustment/fine-tuning of the operating conditions of the OCT system, and investigations are currently under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications, such as the fabrication of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
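As an illustration of the kind of standard FD-OCT processing chain that such GPU acceleration targets, the sketch below converts a batch of raw spectral interferograms into log-scaled A-scans on the GPU using CuPy (background subtraction, spectral apodisation, FFT, log magnitude). The array shapes and the exact sequence of steps are generic assumptions rather than the thesis's pipeline, which would also include steps such as resampling to linear k-space.

```python
import numpy as np
import cupy as cp

def ascans_from_spectra(raw_spectra):
    """Convert raw FD-OCT spectra (n_alines x n_pixels) into log-scaled
    A-scans on the GPU. Generic pipeline: background subtraction,
    Hann apodisation, FFT along the spectral axis, log magnitude."""
    spectra = cp.asarray(raw_spectra, dtype=cp.float32)
    background = spectra.mean(axis=0, keepdims=True)        # fixed-pattern background
    window = cp.asarray(np.hanning(spectra.shape[1]), dtype=cp.float32)
    depth = cp.fft.fft((spectra - background) * window, axis=1)
    ascans = 20 * cp.log10(cp.abs(depth[:, : spectra.shape[1] // 2]) + 1e-12)
    return cp.asnumpy(ascans)                                # copy back to host

# Hypothetical frame: 512 A-lines of 2048 spectrometer pixels (simulated data).
frame = np.random.rand(512, 2048).astype(np.float32)
bscan = ascans_from_spectra(frame)
print(bscan.shape)   # (512, 1024)
```

Because every A-line is processed by the same element-wise operations and a batched FFT, the whole frame maps naturally onto the GPU, which is what makes real-time processing and rendering feasible once the camera, rather than the computation, becomes the limiting factor.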