877 results for Sites of memory
Abstract:
Recent evidence suggests that humans can form and later retrieve new semantic relations unconsciously by way of the hippocampus, the key structure also recruited for conscious relational (episodic) memory. If the hippocampus subserves both conscious and unconscious relational encoding/retrieval, one would expect the hippocampus to be a place of unconscious-conscious interactions during memory retrieval. We tested this hypothesis in an fMRI experiment probing the interaction between the unconscious and conscious retrieval of face-associated information. To establish unconscious relational memories, we presented subliminal (masked) combinations of unfamiliar faces and written occupations ("actor" or "politician"). At test, we presented the formerly subliminal faces, but now supraliminally, as cues for the reactivation of the unconsciously associated occupations. We hypothesized that unconscious reactivation of the associated occupation (actor or politician) would facilitate or inhibit the subsequent conscious retrieval of a celebrity's occupation, which was also actor or politician. Depending on whether the reactivated unconscious occupation was congruent or incongruent with the celebrity's occupation, we expected either a quicker or a delayed conscious retrieval process. Conscious retrieval was quicker in the congruent condition relative to a neutral baseline condition, but not delayed in the incongruent condition. fMRI data collected during subliminal face-occupation encoding confirmed previous evidence that the hippocampus interacts with neocortical storage sites of semantic knowledge to support relational encoding. fMRI data collected at test revealed that the facilitated conscious retrieval was paralleled by deactivations in the hippocampus and neocortical storage sites of semantic knowledge. We assume that the unconscious reactivation pre-activated overlapping relational representations in the hippocampus, reducing the neural effort for conscious retrieval.
This finding supports the notion of synergistic interactions between conscious and unconscious relational memories in a common, cohesive hippocampal-neocortical memory space.
Abstract:
Background: Little research has been conducted to assess the effect of using memory training with school-aged children who were born very preterm. This study aimed to determine whether two types of memory training approaches resulted in an improvement of trained functions and/or a generalization of the training effect to non-trained cognitive domains. Methods: Sixty-eight children born very preterm (7-12 years) were randomly allocated to a group undertaking memory strategy training (n=23), working memory training (n=22), or a waiting control group (n=23). Neuropsychological assessment was performed before and immediately after the training or waiting period, and at a six-month follow-up. Results: In both training groups, significant improvement of different memory domains occurred immediately after training (near transfer). Improvement of non-trained arithmetic performance was observed after strategy training (far transfer). At a six-month follow-up assessment, children in both training groups demonstrated better working memory, and their parents rated their memory functions to be better than controls. Performance level before the training was negatively associated with the training gain. Conclusions: These results highlight the importance of cognitive interventions, in particular the teaching of memory strategies, in very preterm-born children at early school age to strengthen cognitive performance and prevent problems at school.
Abstract:
Neuronal outgrowth has been proposed in many systems as a mechanism underlying memory storage. For example, sensory neuron outgrowth is widely accepted as an underlying mechanism of long-term sensitization of defensive withdrawal reflexes in Aplysia. The hypothesis is that learning leads to outgrowth and consequently to the formation of new synapses, which in turn strengthen the neural circuit underlying the behavior. However, key experiments to test this hypothesis have never been performed. Four days of sensitization training leads to outgrowth of siphon sensory neurons mediating the siphon-gill withdrawal response in Aplysia. We found that a similar training protocol produced robust outgrowth in tail sensory neurons mediating the tail-siphon withdrawal reflex. In contrast, 1 day of training, which effectively induces long-term behavioral sensitization and synaptic facilitation, was not associated with neuronal outgrowth. Further examination of the effect of behavioral training protocols on sensory neuron outgrowth indicated that this structural modification is associated only with the most persistent forms of sensitization, and that the induction of these changes depends on the spacing of the training trials over multiple days. Therefore, we suggest that neuronal outgrowth is not a universal mechanism underlying long-term sensitization but is involved only in the most persistent forms of the memory. Sensory neuron outgrowth presumably contributes to long-term sensitization through the formation of new synapses with follower motor neurons, but this hypothesis has never been directly tested. The contribution of outgrowth to long-term sensitization was assessed using confocal microscopy to examine sites of contact between physiologically connected pairs of sensory and motor neurons.
Following 4 days of training, the strength of both the behavior and the sensorimotor synapse and the number of appositions with follower neurons were enhanced only on the trained side of the animal. In contrast, outgrowth was induced on both sides of the animal, indicating that although sensory neuron outgrowth does appear to contribute to sensitization through the formation of new synapses, outgrowth alone is not sufficient to account for the effects of sensitization. This indicates that key regulatory steps lie downstream of outgrowth, possibly in the targeting of new processes and the activation of new synapses.
Abstract:
Compressional (Vp) and shear (Vs) wave velocities have been measured to 10 kb in 32 cores of basalt from 14 Pacific sites of the Deep Sea Drilling Project. Both Vp and Vs show wide ranges (3.70 to 6.38 km/sec for Vp and 1.77 to 3.40 km/sec for Vs at 0.5 kb) which are linearly related to density and sea floor age, confirming earlier findings by Christensen and Salisbury of decreasing velocity with progressive submarine weathering, based on studies of basalts from five sites in the Atlantic. Combined Pacific and Atlantic data give rates of decreasing velocity of -1.89 and -1.35 km/sec per 100 m.y. for Vp and Vs, respectively. New analyses of oceanic seismic refraction data indicate a decrease in layer 2 velocities with age similar to that observed in the laboratory, suggesting that weathering penetrates to several hundred meters in many regions and is largely responsible for the extreme range and variability of layer 2 refraction velocities.
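The quoted weathering trend is a simple linear relation, and the velocity reduction it predicts for crust of a given age can be computed directly. The sketch below only applies the two rates stated in the abstract (-1.89 and -1.35 km/sec per 100 m.y.); the function and constant names are illustrative, not from the paper.

```python
def velocity_decrease(rate_per_100my, age_my):
    """Predicted velocity change (km/sec) after age_my million years
    of submarine weathering, given a linear rate per 100 m.y."""
    return rate_per_100my * age_my / 100.0

# Rates from the combined Pacific and Atlantic data (km/sec per 100 m.y.)
RATE_VP = -1.89
RATE_VS = -1.35

# Example: expected Vp reduction for 50-m.y.-old basalt
print(velocity_decrease(RATE_VP, 50))
```

Under this linear model, 50-m.y.-old basalt would show roughly a 0.95 km/sec reduction in Vp, consistent with the wide velocity ranges the abstract reports across sea floor ages.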
Abstract:
Inoceramus is an epibenthic bivalve which lived in a wide variety of paleoenvironments encompassing a broad range of paleodepths. A survey of all Cretaceous sediments from Deep Sea Drilling Project legs 1-69 and 75 revealed over 500 Inoceramus specimens at twenty sites. Of these, 47 well-preserved Late Cretaceous specimens from the South Atlantic, Pacific and Indian Oceans were analyzed for oxygen and carbon isotopes. The specimens exhibit small internal isotopic variability and oxygen isotopic paleotemperatures that are consistent with a deep-sea habitat. Paleotemperatures ranging from 5 to 16°C show that Late Cretaceous oceans were significantly warmer than the present oceans. The data suggest that deep water was formed both by cooling at high latitudes and by evaporation in the subtropics.
Abstract:
Calcareous nannofossils were studied by light microscopy in Neogene sedimentary rocks recovered at four sites of the Ocean Drilling Program Leg 127 in the Japan Sea. Nannofossils occur sporadically at all sites, and allow recognition of seven zones and two subzones: four zones in the Holocene to the uppermost Pliocene, and three zones and two subzones in the middle to lower Miocene. Forty-eight nannofossil species are recognized in 95 of the 808 irregularly spaced samples taken from all the sites. The nannofossil assemblages in the Miocene are more diverse than those in the Holocene to Pliocene sedimentary interval. The greater diversity and the presence of warm-water taxa, such as Sphenolithus and discoasters, in the upper lower Miocene to lower middle Miocene suggest a relatively warm and stable surface-water condition, attributed to an increased supply of warm water from the subtropical western Pacific Ocean. Site 797 in the southern part of the Yamato Basin contains the most complete and the oldest nannofossil record so far reported from the Japan Sea. The lowermost nannofossil zone at this site, the Helicosphaera ampliaperta Zone (15.7-18.4 Ma), gives a minimum age for the Yamato Basin. This age range predates rotation of southwest Japan, an event previously believed to be caused by the opening of the Japan Sea.
Abstract:
Chemical analyses were performed on major, minor, and rare-earth elements of pelagic and hemipelagic sediments from the forearc, arc, and backarc sites of the Izu-Bonin Arc, Ocean Drilling Program Leg 126. The analyses indicate that the chemical composition of these sediments is strongly controlled by the composition of the arc rocks that source them. The Oligocene sediments, which are characterized by high MgO contents, reflect the chemical composition of the Paleogene volcanic rocks of the immature arc, whereas the late Miocene to Quaternary sediments, with low MgO contents, reflect the composition of the present arc. We also suggest that sedimentation rates and the topography of the sedimentary basin affect the MnO and SiO2 contents of pelagic and hemipelagic sediments.
Abstract:
Late Cretaceous (Maastrichtian)-Quaternary summary biostratigraphies are presented for Ocean Drilling Program (ODP) Leg 189 Sites 1168 (West Tasmanian Margin), 1170 and 1171 (South Tasman Rise), and 1172 (East Tasman Plateau). The age models are calibrated to magnetostratigraphy and integrate both calcareous (planktonic foraminifers and nannofossils) and siliceous (diatoms and radiolarians) microfossil groups with organic walled microfossils (organic walled dinoflagellate cysts, or dinocysts). We also incorporate benthic oxygen isotope stratigraphies into the upper Quaternary parts of the age models for further control. The purpose of this paper is to provide a summary age-depth model for all deep-penetrating sites of Leg 189 incorporating updated shipboard biostratigraphic data with new information obtained during the 3 yr since the cruise. In this respect we provide a report of work to November 2003, not a final synthesis of the biomagnetostratigraphy of Leg 189, yet we present the most complete integrated age model for these sites at this time. Detailed information of the stratigraphy of individual fossil groups, paleomagnetism, and isotope data are presented elsewhere. Ongoing efforts aim toward further integration of age information for Leg 189 sites and will include an attempt to correlate zonation schemes for all the major microfossil groups and detailed correlation between all sites.
Abstract:
In this paper, we examine the issue of memory management in the parallel execution of logic programs. We concentrate on non-deterministic and-parallel schemes, which we believe present a relatively general set of problems to be solved, including most of those encountered in the memory management of or-parallel systems. We present a distributed stack memory management model which allows flexible scheduling of goals. Previously proposed models (based on the "Marker model") are lacking in that they impose restrictions on the selection of goals to be executed or may consume a large amount of virtual memory. This paper first presents results which imply that these shortcomings can have a significant performance impact. An extension of the Marker Model is then proposed which allows flexible scheduling of goals while keeping (virtual) memory consumption down. Measurements are presented which show the advantage of this solution. Methods for handling forward and backward execution, cut, and roll back are discussed in the context of the proposed scheme. In addition, the paper shows how the same mechanism for flexible scheduling can be applied to the efficient handling of the very general form of suspension that can occur in systems which combine several types of and-parallelism with more sophisticated methods of executing logic programs. We believe that the results are applicable to many and- and or-parallel systems.
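The core idea of a marker-delimited stack can be sketched in a few lines: marker frames record where each goal's segment begins, so backtracking can reclaim exactly that segment. This is a deliberately simplified single-stack toy in Python (class and method names are ours, and it omits the distribution, scheduling flexibility, and suspension handling the abstract actually addresses), intended only to illustrate the marker mechanism.

```python
# Toy illustration of marker-delimited stack segments (not the paper's model):
# a marker frame opens each goal's segment, and rolling back to a marker
# reclaims that goal's memory in one step.

class MarkerStack:
    def __init__(self):
        self.frames = []  # interleaved ("marker", goal) and ("data", value) frames

    def begin_goal(self, goal):
        """Push a marker opening a new goal segment; return its base index."""
        base = len(self.frames)
        self.frames.append(("marker", goal))
        return base

    def push(self, value):
        """Push a data frame (e.g. a binding) into the current segment."""
        self.frames.append(("data", value))

    def backtrack(self, base):
        """Roll back to the marker at `base`, discarding everything above it."""
        del self.frames[base:]

# Usage: two nested goals; backtracking q(Y) leaves p(X)'s segment intact.
s = MarkerStack()
b1 = s.begin_goal("p(X)")
s.push("binding X=1")
b2 = s.begin_goal("q(Y)")
s.push("binding Y=2")
s.backtrack(b2)
```

In a real and-parallel system the segments of different goals can be interleaved by different agents, which is precisely what makes reclamation harder than in this toy and motivates the restrictions (or the virtual-memory cost) of the earlier Marker-model variants the abstract criticizes.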
Abstract:
The application of conservation treatments, such as consolidation and protection treatments, has proved ineffective in many cases, and even harmful. Evaluation studies should be a mandatory task, ideally before and after the intervention, but both tasks are complex and unusual in the case of archaeological heritage. This study focuses mainly on analyzing changes in the petrophysical properties of stone material from the archaeological sites of Merida (Spain), evaluating, both on site and in the laboratory, effects derived from the different conservation treatments applied in past interventions, through the integration of different non-destructive techniques (NDT) and portable analytical devices available at the Institute of Geosciences (CSIC, UCM). These techniques allow not only assessment of effectiveness and alteration processes but also monitoring of the durability of treatments, focused mainly on the 1996 intervention in the case of the Roman Theater, as well as various punctual interventions from the 1990s to date in the House of Mitreo. The studies carried out on the archaeological sites of Merida allow us to compare outcomes and also identify limitations in the use of this equipment. In this paper we discuss the use of some of these techniques, their integration and limits, for the assessment of conservation treatments, with examples from the Merida case study.
Abstract:
In recent decades archaeological sites have been the subject of many interventions. The application of conservation treatments, such as consolidation and protection treatments using, for instance, synthetic resins or organosilicic compounds, has proved inadequate in many cases, and even harmful to the heritage materials [1]. Evaluation studies should be a mandatory task, ideally before and after the intervention, but both tasks are complex and unusual in the case of archaeological heritage. Moreover, there is a general lack of knowledge of the mid- and long-term effects of these treatments, and of how to act when they have resulted in deterioration of the original material. The remains of Roman Augusta Emerita, located in Merida (Spain), have undergone many interventions since the first archaeological campaign, in 1910. Some have already been demonstrated to be harmful [2]; others, more recent, must be evaluated to determine their effectiveness and durability, considering that many of these treatments are still applied today. For this purpose a range of parameters has been measured, such as color, surface hardness and roughness, mechanical and hydric properties, porosity, etc., on the original material (mainly granite, marble, and mortars), and the transformations of those same parameters analyzed after treatment, both in situ, in places where an intervention is documented, and in the laboratory, on samples. The study is being conducted both in the laboratory (the Petrophysics Laboratory within IGEO) and in situ, at selected archaeological sites of Mérida (the Theater and the House of Mitreo). The comparison of results from untreated and treated areas of the site, and from treated and untreated samples, allows the distinction of variables that affect the interaction between products and stone material, issues such as the effectiveness and durability of a treatment and its validation or dismissal.
Abstract:
Each of the core histone proteins within the nucleosome has a central “structured” domain that comprises the spool onto which the DNA superhelix is wrapped and an N-terminal “tail” domain in which the structure and molecular interactions have not been rigorously defined. Recent studies have shown that the N-terminal domains of core histones probably contact both DNA and proteins within the nucleus and that these interactions play key roles in the regulation of nuclear processes (such as transcription and replication) and are critical in the formation of the chromatin fiber. An understanding of these complex mechanisms awaits identification of the DNA or protein sites within chromatin contacted by the tail domains. To this end, we have developed a site-specific histone protein–DNA photocross-linking method to identify the DNA binding sites of the N-terminal domains within chromatin complexes. With this approach, we demonstrate that the N-terminal tail of H2A binds DNA at two defined locations within isolated nucleosome cores centered around a position ≈40 bp from the nucleosomal dyad and that this tail probably adopts a defined structure when bound to DNA.