1000 results for "quantum memory"
Abstract:
Graffiti, Memory and Contested Space: Mnemonic Initiatives Following Periods of Trauma and/or Repression in Buenos Aires, Argentina. This thesis concerns the popular articulation of memory following periods or incidents of trauma in Argentina. I am interested in how groups lay claim to various public spaces in the city and how they convert these spaces into mnemonic battlegrounds. In considering these spaces of trauma and places of memory, I am primarily interested in how graffiti writing (stencils, spray-paint, signatures, etchings, wall-paintings, murals and installations) is used to make these spaces transmit particular memories that impugn official versions of the past. This thesis draws on literatures focused on popular/public memory. Scholars argue that memory is socially constructed and thus actively contested. Marginal initiatives such as graffiti writing challenge the memory projects of the state as well as state projects that are perceived by citizens to be 'inadequate,' 'inappropriate,' and/or as promoting the erasure of memory. Many of these initiatives are a reaction to the pro-reconciliation and pro-oblivion strategies of previous governments. I outline how the history of silences and impunity, and a longstanding emphasis on reconciliation at the expense of truth and justice, has created an environment of vulnerable memory in Argentina. Popular memory entrepreneurs react by aggressively articulating their memories in time and in space. As a result of this intense memory work, the built landscape in Buenos Aires is dotted with mnemonic initiatives that aim to contradict or subvert officially sanctioned memories. I also suggest that memory workers in Argentina persistently and carefully use the sites of trauma as well as key public spaces to ensure official as well as popular audiences.
The data for this project were collected in five spaces in Buenos Aires: the Plaza de Mayo, Plaza Congreso, the República Cromañón nightclub, Avellaneda Train Station and El Olimpo, a former detention centre from the military dictatorship.
Abstract:
Fluid intelligence has been defined as an innate ability to reason, commonly measured by the Raven's Progressive Matrices (RPM). Individual differences in fluid intelligence are currently explained by the Cascade model (Fry & Hale, 1996) and the Controlled Attention hypothesis (Engle, Kane, & Tuholski, 1999; Kane & Engle, 2002). The first theory is based on a complex relation among age, speed, and working memory which is described as a cascade. The alternative, the Controlled Attention hypothesis, is based on the proposition that it is the executive attention component of working memory that explains performance on fluid intelligence tests. The first goal of this study was to examine whether the Cascade model is consistent within the visuo-spatial and verbal-numerical modalities. The second goal was to examine whether the executive attention component of working memory accounts for the relation between working memory and fluid intelligence. Two hundred and six undergraduate students between the ages of 18 and 28 completed a battery of cognitive tests selected to measure processing speed, working memory, and controlled attention, drawn from two cognitive modalities, verbal-numerical and visuo-spatial. These were used to predict performance on two standard measures of fluid intelligence: the RPM and the Shipley Institute of Living Scales (SILS) subtests. Multiple regression and Structural Equation Modeling (SEM) were used to test the Cascade model and to determine the independent and joint effects of controlled attention and working memory on general fluid intelligence. Among the processing speed measures only spatial scan was related to the RPM. No other significant relations were observed between processing speed and fluid intelligence. As a construct, working memory was related to the fluid intelligence tests.
Consistent with the predictions for the RPM there was support for the Cascade model within the visuo-spatial modality but not within the verbal-numerical modality. There was no support for the Cascade model with respect to the SILS tests. SEM revealed that there was a direct path between controlled attention and RPM and between working memory and RPM. However, a significant path between set switching and RPM explained the relation between controlled attention and RPM. The prediction that controlled attention mediated the relation between working memory and RPM was therefore not supported. The findings support the view that the Cascade model may not adequately explain individual differences in fluid intelligence and this may be due to the differential relations observed between working memory and fluid intelligence across different modalities. The findings also show that working memory is not a domain-general construct and as a result its relation with fluid intelligence may be dependent on the nature of the working memory modality.
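The mediation question tested above, whether controlled attention carries the relation between working memory and RPM, can be sketched as a regression decomposition. The sketch below is purely illustrative: the variable names, effect sizes, and simulated data are invented, not taken from the study, and a simple Baron-Kenny-style decomposition stands in for the full SEM.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
wm = rng.normal(size=n)                                   # working memory (hypothetical scores)
ca = 0.6 * wm + rng.normal(scale=0.8, size=n)             # controlled attention (mediator)
rpm = 0.5 * ca + 0.1 * wm + rng.normal(scale=0.8, size=n) # fluid intelligence outcome

def ols_slopes(y, cols):
    """Least-squares slopes of y on the given predictor columns (with intercept)."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

c = ols_slopes(rpm, [wm])[0]            # total effect of WM on RPM
a = ols_slopes(ca, [wm])[0]             # WM -> mediator path
c_prime, b = ols_slopes(rpm, [wm, ca])  # direct effect and mediator -> RPM path
indirect = a * b                        # effect carried through the mediator
print(f"total={c:.3f}  direct={c_prime:.3f}  indirect={indirect:.3f}")
```

In linear models the total effect decomposes exactly as total = direct + indirect; full mediation would require the direct path to vanish, which is the pattern the study did not find.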
Abstract:
Reduced capacity for executive cognitive function and for the autonomic control of cardiac responsivity are both concomitants of the aging process. These may be linked through their mutual dependence on medial prefrontal function, but the specifics of that linkage have not been well explored. Executive functions associated with medial prefrontal cortex involve various aspects of performance monitoring, whereas centrally mediated autonomic functions can be observed as heart rate variability (HRV), i.e., variability in the length of intervals between heart beats. The focus of this thesis was to examine the degree to which the capacity for phasic autonomic adjustments to heart rate relates to performance monitoring in younger and older adults, using measures of electrocortical and autonomic activity. Behavioural performance and attention allocation during two age-sensitive tasks could be predicted by various aspects of autonomic control. For young adults, greater influence of the parasympathetic system on HRV was beneficial for learning unfamiliar maze paths; for older adults, greater sympathetic influence was detrimental to these functions. Further, these relationships were primarily evoked when the task required the construction and use of internalized representations of mazes rather than passive responses to feedback. When memory for source was required, older adults made three times as many source errors as young adults. However, greater parasympathetic influence on HRV in the older group was conducive to avoiding source errors and to reduced electrocortical responses to irrelevant information. Higher sympathetic predominance, in contrast, was associated with higher rates of source error and greater electrocortical responses to non-target information in both groups. These relations were not seen for errors associated with a speeded perceptual task, irrespective of its difficulty level.
Overall, autonomic modulation of cardiac activity was associated with higher levels of performance monitoring, but differentially across tasks and age groups. With respect to age, those older adults who had maintained higher levels of autonomic cardiac regulation appeared to have also maintained higher levels of executive control over task performance.
Abstract:
The aim of this study was to investigate the neural correlates of operant conditioning in a semi-intact preparation of the pond snail, Lymnaea stagnalis. Lymnaea learns, via operant conditioning, to reduce its aerial respiratory behaviour in response to an aversive tactile stimulus to its open pneumostome. This thesis demonstrates the successful conditioning of naïve semi-intact preparations to show learning in the dish. Furthermore, these conditioned preparations show long-term memory that persists for at least 18 hours. As the neurons that generate this behaviour have been previously identified, I can, for the first time, monitor neural activity during both learning and long-term memory consolidation in the same preparation. In particular, I record from the respiratory neuron Right Pedal Dorsal 1 (RPeD1), which is part of the respiratory central pattern generator. In this study, I demonstrate that preventing RPeD1 impulse activity between training sessions reduces the number of sessions needed to produce long-term memory in the present semi-intact preparation.
Abstract:
Four problems of physical interest have been solved in this thesis using the path integral formalism. Using the trigonometric expansion method of Burton and de Borde (1955), we found the kernel for two interacting one-dimensional oscillators. The result is the same as one would obtain using a normal coordinate transformation. We next introduced the method of Papadopolous (1969), which is a systematic perturbation-type method specifically geared to finding the partition function Z, or equivalently, the Helmholtz free energy F, of a system of interacting oscillators. We applied this method to the next three problems considered. First, by summing the perturbation expansion, we found F for a system of N interacting Einstein oscillators. The result obtained is the same as the usual result obtained by Shukla and Muller (1972). Next, we found F to O(λ⁴), where λ is the usual Van Hove ordering parameter. The results obtained are the same as those of Shukla and Cowley (1971), who used a diagrammatic procedure and did the necessary sums in Fourier space; we performed the work in temperature space. Finally, slightly modifying the method of Papadopolous, we found the finite-temperature expressions for the Debye-Waller factor in Bravais lattices, to O(λ²) and O(|K|⁴), where K is the scattering vector. The high-temperature limits of the expressions obtained here are in complete agreement with the classical results of Maradudin and Flinn (1963).
Abstract:
Methods for both partial and full optimization of wavefunction parameters are explored, and these are applied to the LiH molecule. A partial optimization can be performed with little difficulty, but to perform a full optimization we must avoid a wrong minimum and deal with linear-dependency, time-step-dependency and ensemble-dependency problems. Five basis sets are examined. The optimized wavefunction with a 3-function set gives a variational energy of -7.998 ± 0.005 a.u., which is comparable to that (-7.990 ± 0.003) of Reynolds' unoptimized Ψ_FN (a double-ζ set of eight functions). The optimized wavefunction with a double-ζ plus 3d_z² set gives an energy of -8.052 ± 0.003 a.u., which is comparable with the fixed-node energy (-8.059 ± 0.004) of the Ψ_FN. The optimized double-ζ function itself gives an energy of -8.049 ± 0.002 a.u. Each number above was obtained on a Burroughs 7900 mainframe computer with 14-15 hrs of CPU time.
Abstract:
In Part I, theoretical derivations for Variational Monte Carlo calculations are compared with results from a numerical calculation of He; both indicate that minimization of the ratio estimate of E_var, denoted E_MC, provides different optimal variational parameters than does minimization of the variance of E_MC. Similar derivations for Diffusion Monte Carlo calculations provide a theoretical justification for empirical observations made by other workers. In Part II, importance sampling in prolate spheroidal coordinates allows Monte Carlo calculations to be made of E_var for the van der Waals molecule He₂, using a simplifying partitioning of the Hamiltonian and both an HF-SCF and an explicitly correlated wavefunction. Improvements are suggested which would permit the extension of the computational precision to the point where an estimate of the interaction energy could be made.
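The contrast between minimizing the sampled energy and minimizing its variance can be seen in a minimal Variational Monte Carlo sketch. This is not the He₂ calculation from the thesis; it is a toy hydrogen-atom example with trial wavefunction ψ(r) = e^(-αr), chosen because its local energy is known in closed form.

```python
import numpy as np

def local_energy(r, alpha):
    # For psi(r) = exp(-alpha * r) on the hydrogen atom (atomic units):
    # E_L = -alpha^2/2 + (alpha - 1)/r, which is constant (-0.5) when alpha = 1.
    return -0.5 * alpha**2 + (alpha - 1.0) / r

def vmc_energy(alpha, n_steps=50000, step=0.5, seed=1):
    """Metropolis sampling of |psi|^2; returns (mean, variance) of the local energy."""
    rng = np.random.default_rng(seed)
    pos = np.array([1.0, 0.0, 0.0])
    samples = []
    for _ in range(n_steps):
        trial = pos + rng.uniform(-step, step, size=3)
        r_old, r_new = np.linalg.norm(pos), np.linalg.norm(trial)
        # Accept with probability |psi(trial)/psi(pos)|^2 = exp(-2 alpha (r_new - r_old))
        if rng.random() < np.exp(-2.0 * alpha * (r_new - r_old)):
            pos = trial
        samples.append(local_energy(np.linalg.norm(pos), alpha))
    e = np.array(samples[5000:])  # discard equilibration steps
    return e.mean(), e.var()
```

At the exact α = 1 the local energy is constant, so the energy estimate and its variance are minimized simultaneously; for an inexact trial function the two criteria need not select the same parameters, which is the tension the abstract describes.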
Abstract:
Nanoporous materials with large surface area and well-ordered pore structure have been synthesized. Thiol groups were grafted onto the materials' surface to make heavy metal ion pre-concentration media. The adsorption properties of the materials were explored. Mercury, gold and silver can be strongly adsorbed by these materials, even in the presence of alkaline earth metal ions. Though the materials can adsorb other heavy metal ions such as lead and copper, they show differential adsorption ability when several ions are present in solution. The adsorption sequence is: mercury ≈ silver > copper ≫ lead and cadmium. In the second part of this work, the memory effects of mercury, gold, silver and boron were investigated. The addition of 2% L-cysteine and 1% thiourea eliminates the memory effects of the three metal ions completely. The wash-out time for mercury dropped from more than 20 minutes to 18 seconds, and the wash-out time for gold decreased from more than 30 minutes to 49 seconds. The memory effect of boron can be reduced by the use of mannitol.
Abstract:
This study examined the effectiveness of motor-encoding activities on memory and performance of students in a Grade One reading program. There were two experiments in the study. Experiment 1 replicated a study by Eli Saltz and David Dixon (1982). The effect of motoric enactment (i.e., pretend play) of sentences on memory for the sentences was investigated. Forty Grade One students performed a "memory-for-sentences" technique devised by Saltz and Dixon. Only the experimental group used motoric enactment of the sentences. Although quantitative findings revealed no significant difference between the mean scores of the experimental group versus the control group, aspects of the experimental design could have affected the results. It was suggested that Saltz and Dixon's study could be replicated again, with more attention given to variables such as population size, nature of the test sentences, subjects' previous educational experience and conditions related to the testing environment. The second experiment was an application of Saltz and Dixon's theory that motoric imagery should facilitate memory for sentences. The intent was to apply this theory to Grade One students' ability to remember words from their reading program. An experimental gym program was developed using kinesthetic activities to reinforce the skills of the classroom reading program. The same subject group was used in Experiment 2. It was hypothesized that the subjects who experienced the experimental gym program would show greater signs of progress in reading ability, as evidenced by their scores on Form G of the Woodcock Reading Mastery Test-Revised (WRMT-R). The data from the WRMT-R were analyzed with a 3-way split-plot analysis of variance in which group (experimental vs. control) and sex were the between-subjects variables and test-time (pre-test vs. post-test) was the within-subjects variable.
Findings revealed the following: (a) both groups made substantial gains over time on the visual-auditory learning sub-test, and the three-way interaction of group × sex × time also was significant; (b) children in the experimental and control groups performed similarly on both the pre- and post-test of the letter identification test; (c) time was the only significant effect on subjects' performance on the word identification task; (d) word attack scores showed marked improvement in performance over time for both the experimental and control groups; (e) passage comprehension scores indicated an improvement in performance for both groups over time. Similar to Experiment 1, it is suggested that several modifications in the experimental design could produce significant results. These factors are addressed with suggestions for further research in the area of active learning; more specifically, the effect of motor-encoding activities on memory and academic performance of children.
Abstract:
Traditional psychometric theory and practice classify people according to broad ability dimensions but do not examine how these mental processes occur. Hunt and Lansman (1975) proposed a 'distributed memory' model of cognitive processes with emphasis on how to describe individual differences, based on the assumption that each individual possesses the same components. It is in the quality of these components that individual differences arise. Carroll (1974) expands Hunt's model to include a production system (after Newell and Simon, 1973) and a response system. He developed a framework of factor analytic (FA) factors for the purpose of describing how individual differences may arise from them. This scheme is to be used in the analysis of psychometric tests. Recent advances in the field of information processing are examined and include: 1) Hunt's development of differences between subjects designated as high or low verbal; 2) Miller's pursuit of the magic number seven, plus or minus two; 3) Ferguson's examination of transfer and abilities; and 4) Brown's discoveries concerning strategy teaching and retardates. In order to examine possible sources of individual differences arising from cognitive tasks, traditional psychometric tests were searched for a suitable perceptual task which could be varied slightly and administered to gauge learning effects produced by controlling independent variables. It also had to be suitable for analysis using Carroll's framework. The Coding Task (a symbol substitution test) found in the Performance Scale of the WISC was chosen. Two experiments were devised to test the following hypotheses: 1) High verbals should be able to complete significantly more items on the Symbol Substitution Task (SST) than low verbals (Hunt & Lansman, 1975). 2) Having previous practice on a task, where strategies involved in the task may be identified, increases the amount of output on a similar task (Carroll, 1974).
3) There should be a substantial decrease in the amount of output as the load on STM is increased (Miller, 1956). 4) Repeated measures should produce an increase in output over trials, and where individual differences in previously acquired abilities are involved, these should differentiate individuals over trials (Ferguson, 1956). 5) Teaching slow learners a rehearsal strategy would improve their learning such that their learning would resemble that of normals on the same task (Brown, 1974). In the first experiment 60 subjects were divided into high and low verbal groups, further divided randomly into a practice group and a nonpractice group. Five subjects in each group were assigned randomly to work on a five, seven or nine digit code throughout the experiment. The practice group was given three trials of two minutes each on the practice code (designed to eliminate transfer effects due to symbol similarity) and then three trials of two minutes each on the actual SST task. The nonpractice group was given three trials of two minutes each on the same actual SST task. Results were analyzed using a four-way analysis of variance. In the second experiment 18 slow learners were divided randomly into two groups: one group receiving a planned strategy practice, the other receiving random practice. Both groups worked on the actual code to be used later in the actual task. Within each group subjects were randomly assigned to work on a five, seven or nine digit code throughout. Both practice and actual tests consisted of three trials of two minutes each. Results were analyzed using a three-way analysis of variance. It was found in the first experiment that 1) high or low verbal ability by itself did not produce significantly different results; however, when in interaction with the other independent variables, a difference in performance was noted. 2) The previous practice variable was significant over all segments of the experiment.
Those who received previous practice were able to score significantly higher than those without it. 3) Increasing the size of the load on STM severely restricts performance. 4) The effect of repeated trials proved to be beneficial; generally, gains were made on each successive trial within each group. 5) In the second experiment, slow learners who were allowed to practice randomly performed better on the actual task than subjects who were taught the code by means of a planned strategy. Upon analysis using the Carroll scheme, individual differences were noted in the ability to develop strategies of storing, searching and retrieving items from STM, and in adopting necessary rehearsals for retention in STM. While these strategies may benefit some, it was found that for others they may be harmful. Temporal aspects and perceptual speed were also found to be sources of variance within individuals. Generally it was found that the largest single factor influencing learning on this task was the repeated measures. What enables gains to be made varies with individuals. There are environmental factors, specific abilities, strategy development, previous learning, amount of load on STM, and perceptual and temporal parameters which influence learning, and these have serious implications for educational programs.
Abstract:
The infinitesimal differential quantum Monte Carlo (QMC) technique is used to estimate electrostatic polarizabilities of the H and He atoms up to the sixth order in the electric field perturbation. All 542 different QMC estimators of the nonzero atomic polarizabilities are derived and used in order to decrease the statistical error and to obtain the maximum efficiency of the simulations. We are confident that the estimates are "exact" (free of systematic error): the two atoms are nodeless systems, hence no fixed-node error is introduced. Furthermore, we develop and use techniques which eliminate the systematic error inherent in extrapolating our results to zero time-step and large stack-size. The QMC results are consistent with published accurate values obtained using perturbation methods. The precision is found to be related to the number of perturbations, varying from 2 to 4 significant digits.
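The zero-time-step extrapolation mentioned above can be illustrated with a toy fit: evaluate the estimator at several finite time steps τ and read the τ → 0 intercept off a low-order polynomial fit. The numbers below are fabricated for illustration (an exactly linear time-step bias toward a true value of -0.5 a.u.), not results from the thesis.

```python
import numpy as np

# Hypothetical QMC energies at several finite time steps (atomic units)
taus = np.array([0.08, 0.04, 0.02, 0.01])
energies = np.array([-0.48800, -0.49400, -0.49700, -0.49850])

# Fit E(tau) = E0 + a*tau (and, as a check, + b*tau^2), then extrapolate to tau = 0
lin = np.polynomial.polynomial.polyfit(taus, energies, 1)    # [E0, a]
quad = np.polynomial.polynomial.polyfit(taus, energies, 2)   # [E0, a, b]
print(f"linear intercept:    {lin[0]:.5f}")
print(f"quadratic intercept: {quad[0]:.5f}")
```

Agreement between the linear and quadratic intercepts is one practical check that the chosen time steps are small enough for the extrapolation to be trusted.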
Abstract:
A presentation made at the CAUT Librarians Conference in Ottawa, Ontario in October 2005.
Abstract:
Photosynthesis in general is a key biological process on Earth and Photosystem II (PSII) is an important component of this process. PSII is the only enzyme capable of oxidizing water and is largely responsible for the primordial build-up and present maintenance of the oxygen in the atmosphere. This thesis endeavoured to understand the link between structure and function in PSII with special focus on primary photochemistry, repair/photodamage and spectral characteristics. The deletion of the PsbU subunit of PSII in cyanobacteria caused a decoupling of the phycobilisomes (PBS) from PSII, likely as a result of increased rates of PSII photodamage, with the PBS decoupling acting as a measure to protect PSII from further damage. Isolated fractions of spinach thylakoid membranes were utilized to characterize the heterogeneity present in the various compartments of the thylakoid membrane. It was found that the pooled PSII-LHCII pigment populations were connected in the grana stack and there was also a progressive decrease in the reaction rates of primary photochemistry and antennae size of PSII as the sample origin moved from grana to stroma. The results were consistent with PSII complexes becoming damaged in the grana and being sent to the stroma for repair. The dramatic quenching of variable fluorescence and overall fluorescent yield of PSII in desiccated lichens was also studied in order to investigate the mechanism by which the quenching operated. It was determined that the source of the quenching was a novel long-wavelength-emitting external quencher. Point mutations to amino acids acting as ligands to chromophores of interest in PSII were utilized in cyanobacteria to determine the role of specific chromophores in energy transfer and primary photochemistry.
These results indicated that the H114-ligated chlorophyll acts as the 'trap' chlorophyll in CP47 at low temperature and that the Q130E mutation imparts considerable changes to PSII electron transfer kinetics, essentially protecting the complex via increased non-radiative charge recombination.
Abstract:
Cognitive control involves the ability to flexibly adjust cognitive processing in order to resist interference and promote goal-directed behaviour. Although frontal cortex is considered to be broadly involved in cognitive control, the mechanisms by which frontal brain areas implement control functions are unclear. Furthermore, aging is associated with reductions in the ability to implement control functions, and questions remain as to whether unique cortical responses serve a compensatory role in maintaining maximal performance in later years. Described here are three studies in which electrophysiological data were recorded while participants performed modified versions of the standard Sternberg task. The goal was to determine how top-down control is implemented in younger adults and altered in aging. In Study 1, the effects of frequent stimulus repetition on the interference-related N450 were investigated in a Sternberg task with a small stimulus set (requiring extensive stimulus resampling) and a task with a large stimulus set (requiring no stimulus resampling). The data indicated that the constant stimulus resampling required by employing small stimulus sets can undercut the effect of proactive interference on the N450. In Study 2, younger and older adults were tested in a standard version of the Sternberg task to determine whether the unique frontal positivity, previously shown to predict memory impairment in older adults during a proactive interference task, would be associated with improved performance when memory recognition could be aided by unambiguous stimulus familiarity. Here, results indicated that the frontal positivity was associated with poorer memory performance, replicating the effect observed in a more cognitively demanding task, and showing that stimulus familiarity does not mediate compensatory cortical activations in older adults.
Although the frontal positivity could be interpreted to reflect maladaptive cortical activation, it may also reflect attempts at compensation that fail to fully ameliorate age-related decline. Furthermore, the frontal positivity may be the result of older adults' reliance on late-occurring, controlled processing, in contrast to younger adults' ability to identify stimuli at very early stages of processing. In the final study, working memory load was manipulated in the proactive interference Sternberg task in order to investigate whether the N450 reflects simple interference detection, with little need for cognitive resources, or an active conflict resolution mechanism that requires executive resources to implement. Independent component analysis was used to isolate the effect of interference, revealing that the canonical N450 was based on two dissociable cognitive control mechanisms: a left frontal negativity that reflects active interference resolution but requires executive resources to implement, and a right frontal negativity that reflects global response inhibition that can be relied on when executive resources are minimal, but at the cost of a slowed response. Collectively, these studies advance understanding of the factors that influence younger and older adults' ability to satisfy goal-directed behavioural requirements in the face of interference, and the effects of age-related cognitive decline.
Abstract:
This work investigates mathematical details and computational aspects of Metropolis-Hastings reptation quantum Monte Carlo and its variants, in addition to the Bounce method and its variants. The issues that concern us include the sensitivity of these algorithms' target densities to the position of the trial electron density along the reptile, time-reversal symmetry of the propagators, and the length of the reptile. We calculate the ground-state energy and one-electron properties of LiH at its equilibrium geometry for all these algorithms. The importance sampling is performed with a single-determinant wavefunction built from a large Slater-type orbital (STO) basis set. The computer codes were written to exploit the efficiencies engineered into modern, high-performance computing software. Using the Bounce method in the calculation of non-energy-related properties, those represented by operators that do not commute with the Hamiltonian, is a novel contribution of this work. We found that the unmodified Bounce method gives a good ground-state energy and very good one-electron properties. We attribute this to the favourable time-reversal symmetry in its target density's Green's functions. Breaking this symmetry gives poorer results. Use of a short reptile in the Bounce method does not alter the quality of the results. This suggests that in future applications one can use a shorter reptile to cut down the computational time dramatically.