832 results for Place of Memory
Abstract:
The present study was performed to validate a spatial working memory task using pharmacological manipulations. The water escape T-maze, which combines the advantages of the Morris water maze and the T-maze while minimizing their disadvantages, was used. Scopolamine, a drug that affects cognitive function in spatial working memory tasks, significantly decreased rat performance in the present delayed alternation task. Since glutamate neurotransmission plays an important role in the maintenance of working memory, we evaluated the effect of ionotropic and metabotropic glutamatergic receptor antagonists, administered alone or in combination, on rat behaviour. As the acquisition and performance of memory tasks has been linked to the expression of the immediate early gene c-Fos, a marker of neuronal activation, we also investigated the neurochemical correlates of the water escape T-maze in various brain areas after pharmacological treatment with glutamatergic antagonists. Moreover, we focused our attention on the involvement of perirhinal cortex glutamatergic neurotransmission in the acquisition and/or consolidation of this particular task. The perirhinal cortex has strong and reciprocal connections with both specific cortical sensory areas and several memory-related structures, including the hippocampal formation and the amygdala. Because of this position, the perirhinal cortex has recently been regarded as a key region in working memory processes, in particular in providing temporary maintenance of information. The effect of perirhinal cortex lesions with ibotenic acid on the acquisition and consolidation of the water escape T-maze task was evaluated. In conclusion, our data suggest that the water escape T-maze can be considered a valid, simple and quite fast method to assess spatial working memory, sensitive to pharmacological manipulations. Following execution of the task, we observed c-Fos expression in several brain regions. Furthermore, in accordance with the literature, our results suggest that glutamatergic neurotransmission plays an important role in the acquisition and consolidation of working memory processes.
Abstract:
The purpose of this research is to deepen the study of the section in architecture. The survey focuses on the project Teatro Domestico by Aldo Rossi, built for the XVII Triennale di Milano in 1986, and, by working through several architectural themes, seeks to verify the timeliness and fertility of the section in new compositional exercises. Through the study of certain areas of Rossi's theory, we tried to find a common thread for reading the theatre project. The theatre is the place of the ephemeral and the artificial, which is why its destiny is the end and the fatal loss. The design and construction of theatre settings has always carried a double meaning, between the value of civil architecture and the testing of newly available technologies. Rossi's experiences in this area are clear examples of the inseparable relationship between the representation of architecture as art and the design of architecture as a model of reality. In the Teatro Domestico, the distinction between representation and the real world is constantly cancelled and restored through the reversal of meaning and through the jump in scale. At present, studies of Rossi's work concern the relationship between architectural composition and the theory of form, focusing on the compositional development of a making process poised between typological analysis and formal invention. The research, through the analysis of a few projects and drawings, tries to examine this issue through the rules of composition, both graphical and constructional, hoping to decipher the mechanism underlying the invention. The almost total lack of published material on the Teatro Domestico project, and the opportunity to visit the archives that preserve the drawings, allowed the author of this study to explore the internal themes of the project, placing this research as a first step toward further analysis of Rossi's works linked to the world of performance. The final aim is therefore to produce material that best describes Rossi's work. Through reading the material published by the author himself and viewing the unpublished material preserved in the archives, it was possible to develop new material and increase knowledge about a work that is otherwise difficult to analyse. The research is divided into two parts. The first, taking into account the close relationship, frequently mentioned by Rossi himself, between archaeology and architectural composition, stresses the importance of the tipo both as a system for reading urban composition and as an open tool of invention. Taking up Ezio Bonfanti's essay on the architect's work, we investigate how the paratactic method is applied in the early works and how, subsequently, the process reaches an accentuated complexity while keeping its basic terms stable. Following a brief introduction to the concept of the section and the different interpretations the term has acquired over time, we tried to identify through it a methodology for reading Rossi's projects. The result is a consistently typological interpretation of the term, related not only to composition in plan but also to the elevations. The section is therefore understood as the overturning of the elevation onto the same plane: the terms used reveal a different approach but a similarity of character. The identification of architectural phonemes allows comparison with other arts.
The research moves in the direction of language, trying to identify the relationship between representation and construction, between the ephemeral and the real world. In this sense it highlights the similarities between the graphic material produced by Rossi and some important examples by contemporary authors. The comparison of the compositional system with the surrealist world of painting and literature helps in understanding and identifying the possible rules applied by Rossi. The second part of the research focuses on the intent of the chosen project. The Teatro Domestico embodies a number of elements that seem to conclude (as an end point but also a starting point) the author's path. With it, the experiments on the theatre begun with the project for the Teatrino Scientifico (1978) and continued with the Teatro del Mondo (1979) flow into a lay tabernacle representing the collective and private memory of the city. Starting from a reading of the project, through the collection of published material, we analysed the explicit themes of the work, tracing their conceptual references. Following examination of the unpublished original materials kept in Aldo Rossi's Archive Collection at the Canadian Center for Architecture in Montréal, a virtual reconstruction of the project is carried out using existing techniques of digital representation, adding to the material a new element for future studies. The reconstruction is part of a larger line of research in which current technologies of composition and representation in architecture stand side by side with research on the compositional method of this architect. The results achieved add to past experiences in the reconstruction of some of Aldo Rossi's lost works. A partial objective is to reactivate a discourse around this work, which is considered minor among the many born of his prolific activity, and to reassess the ephemeral works by giving them the value they have earned. In conclusion, the research aims to open a new field of interest in the section, not only as a technical instrument for representing an idea but as an actual mechanism through which the composition takes shape and the idea is developed.
Abstract:
Nowadays, it is clear that the goal of creating a sustainable future for the next generations requires re-thinking the industrial application of chemistry. It is also evident that more sustainable chemical processes may be economically convenient compared with conventional ones, because fewer by-products mean lower costs for raw materials, separation and disposal treatments; it also implies an increase in productivity and, as a consequence, smaller reactors can be used. In addition, an indirect gain could derive from the better public image of a company marketing sustainable products or processes. In this context, oxidation reactions play a major role, being the tool for the production of huge quantities of chemical intermediates and specialties. Potentially, the impact of these productions on the environment could have been much worse than it is, if continuous effort had not been spent on improving the technologies employed. Substantial technological innovations have driven the development of new catalytic systems and the improvement of reaction and process technologies, contributing to moving the chemical industry towards a more sustainable and ecological approach. The roadmap for the application of these concepts includes new synthetic strategies, alternative reactants, catalyst heterogenisation, and innovative reactor configurations and process design. In order to implement all these ideas in real projects, the development of more efficient reactions is one primary target. Yield, selectivity and space-time yield are the right metrics for evaluating reaction efficiency. In the case of catalytic selective oxidation, the control of selectivity has always been the principal issue, because the formation of total oxidation products (carbon oxides) is thermodynamically more favoured than the formation of the desired, partially oxidized compound. As a matter of fact, only in a few oxidation reactions is total, or close to total, conversion achieved, and usually the selectivity is limited by the formation of by-products or co-products, which often implies unfavourable process economics; moreover, sometimes the cost of the oxidant further penalizes the process. During my PhD work, I investigated four reactions that are emblematic of the new approaches used in the chemical industry. In Part A of my thesis, a new process aimed at a more sustainable production of menadione (vitamin K3) is described. The "greener" approach includes the use of hydrogen peroxide in place of chromate (moving from a stoichiometric oxidation to a catalytic one), also avoiding the production of dangerous waste. Moreover, I studied the possibility of using a heterogeneous catalytic system able to efficiently activate hydrogen peroxide. The overall process would be carried out in two steps: the first is the methylation of 1-naphthol with methanol to yield 2-methyl-1-naphthol; the second is the oxidation of the latter compound to menadione. The catalyst for this latter step, the reaction that was the object of my investigation, consists of Nb2O5-SiO2 prepared with the sol-gel technique. The catalytic tests were first carried out under conditions that simulate the in-situ generation of hydrogen peroxide, that is, using a low concentration of the oxidant. Then, experiments were carried out using higher hydrogen peroxide concentrations.
The study of the reaction mechanism was fundamental to obtaining indications about the best operating conditions and improving the selectivity to menadione. In Part B, I explored the direct oxidation of benzene to phenol with hydrogen peroxide. The industrial process for phenol is the oxidation of cumene with oxygen, which also co-produces acetone. This can be considered a case where economics could drive the sustainability issue; in fact, the new process, which yields phenol directly and avoids the co-production of acetone (a burden for phenol, because the market requirements for the two products are quite different), might be economically convenient with respect to the conventional process if a high selectivity to phenol were obtained. Titanium silicalite-1 (TS-1) was the catalyst chosen for this reaction. By comparing the reactivity results obtained with several TS-1 samples having different chemical-physical properties, and analyzing in detail the effect of the most important reaction parameters, we could formulate some hypotheses concerning the reaction network and mechanism. Part C of my thesis deals with the hydroxylation of phenol to hydroquinone and catechol. This reaction is already applied industrially but, for economic reasons, an improvement of the selectivity to the para-dihydroxylated compound and a decrease of the selectivity to the ortho isomer would be desirable. Also in this case, the catalyst used was TS-1. The aim of my research was to find a method to control the selectivity ratio between the two isomers and, finally, to make the industrial process more flexible, in order to adapt the process performance to fluctuations in market requirements. The reaction was carried out both in a stirred batch reactor and in a re-circulating fixed-bed reactor. In the first system, the effect of various reaction parameters on catalytic behaviour was investigated: type of solvent or co-solvent, and particle size. With the second reactor type, I investigated the possibility of using a continuous system and a catalyst shaped into extrudates (instead of powder), in order to avoid the catalyst filtration step. Finally, Part D deals with the study of a new process for the valorisation of glycerol by means of its transformation into valuable chemicals. This molecule is nowadays produced in large amounts as a co-product of biodiesel synthesis; therefore, it is considered a raw material from renewable resources (a bio-platform molecule). Initially, we tested the oxidation of glycerol in the liquid phase with hydrogen peroxide and TS-1. However, the results achieved were not satisfactory. We then investigated the gas-phase transformation of glycerol into acrylic acid, with the intermediate formation of acrolein; the latter can be obtained by dehydration of glycerol and then oxidized into acrylic acid. The oxidation step from acrolein to acrylic acid is already optimized at an industrial level; therefore, we decided to investigate the first step of the process in depth. I studied the reactivity of heterogeneous acid catalysts based on sulphated zirconia. Tests were carried out under both aerobic and anaerobic conditions, in order to investigate the effect of oxygen on the catalyst deactivation rate (one of the main problems usually encountered in glycerol dehydration).
Finally, I studied the reactivity of bifunctional systems made of Keggin-type polyoxometalates, either alone or supported on sulphated zirconia, thereby combining the acid functionality (necessary for the dehydration step) with the redox functionality (necessary for the oxidation step). In conclusion, during my PhD work I investigated reactions that apply the "green chemistry" rules and strategies; in particular, I studied new, greener approaches for the synthesis of chemicals (Parts A and B), the optimisation of reaction parameters to make an oxidation process more flexible (Part C), and the use of a bio-platform molecule for the synthesis of a chemical intermediate (Part D).
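As a point of reference for the efficiency metrics named in this abstract (yield, selectivity, space-time yield), the definitions assumed here are the standard textbook conventions rather than formulas quoted from the thesis:

\[
\text{Yield} = \frac{n_{\text{desired product formed}}}{n_{\text{reactant fed}}}, \qquad
\text{Selectivity} = \frac{n_{\text{desired product formed}}}{n_{\text{reactant converted}}}, \qquad
\text{STY} = \frac{m_{\text{product}}}{V_{\text{reactor}}\,\Delta t}
\]

Here n denotes moles (corrected for stoichiometry where the coefficients differ), m the product mass, V the reactor or catalyst-bed volume and Δt the time on stream; under these conventions, yield is the product of conversion and selectivity.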
Abstract:
The development and characterization of biomolecule sensor formats based on the optical technique of Surface Plasmon Resonance (SPR) spectroscopy and on electrochemical methods were investigated. The study can be divided into two parts of different scope. In the first part, novel detection schemes for labeled targets were developed on the basis of investigations in Surface-Plasmon Field-Enhanced Spectroscopy (SPFS). The first is an SPR fluorescence imaging format, Surface-Plasmon Field-Enhanced Fluorescence Microscopy (SPFM). Patterned self-assembled monolayers (SAMs) were prepared and used to direct the spatial distribution of biomolecules immobilized on surfaces. Here the patterned monolayers serve as molecular templates to secure different biomolecules to known locations on a surface. The binding processes of labeled target biomolecules from solution to the sensor surface were visually and kinetically recorded by the fluorescence microscope, in which fluorescence was excited by the evanescent field of propagating surface plasmon polaritons. The second format, which also originates from the SPFS technique, Surface-Plasmon Field-Enhanced Fluorescence Spectrometry (SPFSm), concerns the coupling of fluorometry to a normal SPR setup. A spectrograph mounted in place of the photomultiplier or microscope can provide fluorescence spectra as well as fluorescence intensity. This study also demonstrated, for the first time, the analytical combination of surface plasmon enhanced fluorescence detection with analytes tagged with semiconducting nanocrystals (quantum dots, QDs). Electrochemically addressable fabrication of DNA biosensor arrays in an aqueous environment was also developed. An electrochemical method was introduced for the directed in-situ assembly of various specific oligonucleotide catcher probes onto different sensing elements of a multi-electrode array in the aqueous environment of a flow cell. Surface plasmon microscopy (SPM) was utilized for on-line recording of the various functionalization steps. Hybridization reactions between targets from solution and the different surface-bound complementary probes were monitored by surface-plasmon field-enhanced fluorescence microscopy (SPFM), using targets labeled either with organic dyes or with semiconducting quantum dots for color multiplexing. This study provides a new approach to the fabrication of (small) DNA arrays and to the recording and quantitative evaluation of parallel hybridization reactions. In the second part of this work, the idea of combining SP optical and electrochemical characterization was extended to the tethered bilayer lipid membrane (tBLM) format. Tethered bilayer lipid membranes provide a versatile model platform for the study of many membrane-related processes. The thiolipids were first self-assembled on ultraflat gold substrates. Fusion of the monolayers with small unilamellar vesicles (SUVs) formed the distal layer, and the membranes thus obtained have sealing properties comparable to those of natural membranes. The fusion could be monitored optically by SPR as an increase in reflectivity (thickness) upon formation of the outer leaflet of the bilayer. With EIS, a drop in capacitance and a steady increase in resistance could be observed, leading to a tightly sealing membrane with low leakage currents. The assembly of tBLMs and the subsequent incorporation of membrane proteins were investigated with respect to their potential use as a biosensing system.
In the case of valinomycin, the potassium transport mediated by the ion carrier could be shown by a decrease in resistance upon increasing potassium concentration. Potential-mediated formation of membrane pores could be shown for the ion-channel-forming peptide alamethicin (Alm). It was shown that at high positive dc bias (cis negative) Alm channels stay at relatively low conductance levels and show higher permeability to potassium than to tetramethylammonium. The addition of the inhibitor amiloride can partially block the Alm channels and results in an increase of membrane resistance. tBLMs are robust and versatile model membrane architectures that can mimic certain properties of biological membranes. tBLMs with incorporated lipopolysaccharide (LPS) and lipid A, mimicking bacterial membranes, were used to probe the interactions of antibodies against LPS and to investigate the binding and incorporation of the small antimicrobial peptide V4. The influence of membrane composition and charge on the behavior of V4 was also probed. This study demonstrates the possibility of using the tBLM platform to record and evaluate the efficiency or potency of numerous synthesized antimicrobial peptides as potential drug candidates.
Abstract:
People are daily faced with intertemporal choices, i.e., choices differing in the timing of their consequences, frequently preferring smaller-sooner rewards over larger-delayed ones, reflecting temporal discounting of the value of future outcomes. This dissertation addresses two main goals. First, new evidence about the neural bases of intertemporal choice is provided. Following disruption of either the medial orbitofrontal cortex or the insula, the willingness to wait for larger-delayed outcomes is affected in opposite directions, suggesting the causal involvement of these areas in regulating the value computation of rewards available with different timings. These findings are also supported by an imaging study reported here. Second, this dissertation provides new evidence about how temporal discounting can be modulated at the behavioral level through different manipulations, e.g., allowing individuals to think about the distant future, pairing rewards with aversive events, or changing their perceived spatial position. A relationship between intertemporal choice, moral judgement and aging is also discussed. All these findings link together to support a unitary neural model of temporal discounting according to which signals coming from several cortical (i.e., medial orbitofrontal cortex, insula) and subcortical regions (i.e., amygdala, ventral striatum) are integrated to represent the subjective value of both earlier and later rewards, under the top-down regulation of the dorsolateral prefrontal cortex. The present findings also support the idea that the process of outcome evaluation is strictly related to the ability to pre-experience and envision future events through self-projection, to the anticipation of the visceral feelings associated with receiving rewards, and to the psychological distance from rewards. Furthermore, taking into account the emotions and the state of arousal at the time of decision seems necessary to understand the impulsivity associated with preferring smaller-sooner goods in place of larger-later goods.
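For reference, a common formalization of temporal discounting, the hyperbolic model widely used in this literature (assumed here rather than taken from the dissertation itself), expresses the subjective value V of a reward of amount A available after delay D as

\[
V = \frac{A}{1 + kD}
\]

where k is an individual discount rate; a larger k corresponds to steeper devaluation of delayed outcomes, i.e., a stronger preference for smaller-sooner rewards.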
Abstract:
in the everyday clinical practice. With this in mind, the choice of a simple setup would not be enough because, even if the setup is quick and simple, the instrumental assessment would still be an addition to the daily routine. The will to overcome this limit has led to the idea of instrumenting already existing and widely used functional tests. In this way the sensor-based assessment becomes an integral part of the clinical assessment. Reliable and validated signal processing methods have been successfully implemented in Personal Health Systems based on smartphone technology. At the end of this research project there is evidence that such a solution can really and easily be used in clinical practice in both supervised and unsupervised settings. Smartphone-based solutions, together with or in place of dedicated wearable sensing units, can truly become a pervasive and low-cost means of providing suitable testing solutions for quantitative movement analysis with a clear clinical value, ultimately providing enhanced balance and mobility support to an aging population.
Abstract:
Down syndrome (DS) is a genetic pathology characterized by brain hypotrophy and severe cognitive disability. Although defective neurogenesis is an important determinant of cognitive impairment, severe dendritic pathology appears to be an equally important factor. It is well established that serotonin plays a pivotal role in both neurogenesis and dendritic maturation. Since the serotonergic system is profoundly altered in the DS brain, we wondered whether defects in hippocampal development can be rescued by treatment with fluoxetine, a selective serotonin reuptake inhibitor and widely used antidepressant drug. A previous study by our group showed that fluoxetine fully restores neurogenesis in the Ts65Dn mouse model of DS and that this effect is accompanied by a recovery of memory functions. The goal of the current study was to establish whether fluoxetine also restores dendritic development and maturation. In mice aged 45 days, treated with fluoxetine in the postnatal period P3-P15, we examined the dendritic arbor of newborn and mature granule cells of the dentate gyrus (DG). The granule cells of trisomic mice had a severely hypotrophic dendritic arbor, fewer spines and reduced innervation compared with euploid mice. Treatment with fluoxetine fully rescued all these defects. Moreover, the impairment of excitatory and inhibitory inputs to CA3 pyramidal neurons was fully normalized in treated trisomic mice, indicating that fluoxetine can rescue functional connectivity between the DG and CA3. The widespread beneficial effects of fluoxetine on the hippocampal formation suggest that early treatment with fluoxetine can be a suitable therapy, possibly usable in humans, to restore the physiology of hippocampal networks and, hence, memory functions. These findings may open the way for future clinical trials in children and adolescents with DS.
Abstract:
Monoclonal antibodies have emerged as one of the most promising therapeutics in oncology over the last decades. The generation of fully human, tumor antigen-specific antibodies suitable for anti-tumor therapy is laborious and difficult to achieve. Autoreactive B cells expressing such antibodies are detectable in cancer patients and represent a suitable source of human antibodies. However, the isolation and cultivation of this cell type is challenging. A novel method was established to identify antigen-specific B cells. The method is based on the conversion of the antigen-independent CD40 signal into an antigen-specific one. To this end, the artificial fusion proteins ABCos1 and ABCos2 (Antigen-specific B cell co-stimulator) were generated, which consist of an extracellular association domain derived from the constant region of human immunoglobulin (Ig) G1, a transmembrane fragment, and an intracellular signal transducer domain derived from the cytoplasmic domain of the human CD40 receptor. Through association with endogenous Ig molecules, the heterodimeric complex allows the antigen-specific stimulation of both the BCR and CD40. In this work the ability of the ABCos constructs to associate with endogenous IgG molecules was shown. Moreover, crosslinking of ABCos stimulates the activation of NF-κB in HEK293-lucNifty cells and induces proliferation in B cells. The stimulation of ABCos in transfected B cells results in an activation pattern different from that induced by the conventional CD40 signal: ABCos-activated B cells show a mainly IgG isotype-specific activation of memory B cells and are characterized by high proliferation and differentiation into plasma cells. To validate the approach, a model system was set up: B cells were transfected with IVT-RNA encoding an anti-Plac1 B cell receptor (antigen-specific BCR), ABCos, or both. Stimulation with the BCR-specific Plac1 peptide induces proliferation only in the co-transfected B cell population. Moreover, we tested the method in human IgG+ memory B cells from CMV-infected blood donors, in which the stimulation of ABCos-transfected B cells with a CMV peptide induces antigen-specific expansion. These findings show that challenging ABCos-transfected B cells with a specific antigen results in the activation and expansion of antigen-specific B cells, allowing not only their identification but also their cultivation. The described method will help to identify antigen-specific B cells, can be used to characterize (tumor) autoantigen-specific B cells, and allows the generation of fully human antibodies that can be used as diagnostic tools as well as in cancer therapy.
Abstract:
The research work dealt with the study of new catalytic processes for the synthesis of fine chemicals belonging to the class of phenolics, namely 2-phenoxyethanol and hydroxytyrosol. The two synthetic procedures investigated have the advantage of being much closer to the Green Chemistry principles than those currently used industrially. In both cases, the challenge was to find catalysts and methods that lead to the production of less waste and use less hazardous chemicals, safer solvents, and reusable heterogeneous catalysts. In the case of 2-phenoxyethanol, the process investigated involves the use of ethylene carbonate (EC) as the reactant for phenol O-hydroxyethylation, in place of ethylene oxide. Besides EC being a safer reactant, the major advantage of the new synthesis is the better selectivity to the desired product. Moreover, the solid catalyst based on Na-mordenite was fully recyclable. The reaction mechanism and the effect of the Si/Al ratio of the mordenite were investigated. In the case of hydroxytyrosol, one of the most powerful natural antioxidants, a new synthetic procedure was investigated; in fact, the method currently employed, the hydrolysis of oleuropein, an ester extracted from the waste water of olive processing, makes use of large amounts of organic solvents (hexane, ethyl acetate) and involves several expensive purification steps. The synthesis procedure set up involves first the reaction between catechol and 2,2-dimethoxyacetaldehyde, followed by the one-pot reduction of the intermediate to give the desired product. Both steps were optimized in terms of the catalyst used and the reaction conditions, which allowed ca. 70% yield to be reached in each step. The reaction mechanism was investigated and elucidated. During a 3-month period spent at the University of Valencia (with Prof. A. Corma's group), a process for the production of diesel additives (2,5-bis(propoxymethyl)furan) from fructose was investigated.
Abstract:
Primary, productive cytomegalovirus (CMV) infection is efficiently controlled by antiviral CD8+ T cells in the immunocompetent patient. However, the viral genome has the ability to persist in certain cell types in a non-replicative state termed latency, without the production of infectious progeny virus. The molecular mechanisms underlying the establishment and maintenance of latency are still largely unknown. There is evidence that cellular defence mechanisms induce the circularization and chromatinization of viral genomes, thereby largely preventing viral gene expression (Marks & Spector, 1984; Reeves et al., 2006). However, the genomes are not in a completely inactive state. Rather, sporadic transcription of the ie1 and ie2 genes during latency has already been demonstrated for murine CMV (mCMV) (Kurz et al., 1999; Grzimek et al., 2001). In the present work, a comprehensive in vivo latency analysis was performed for the first time to characterize viral transcription over time, based on the transcripts IE1, IE3, E1, m164, M105 and M86, which represent all three kinetic classes. After the establishment of latency, verified by the absence of infectious virus, all tested transcripts could be quantified in the lungs. Interestingly, at no time point of the analysis was the transcriptional activity compatible with the classical IE-E-L kinetics of productive infection. Instead, transcript expression was stochastic, with activity decreasing progressively over time. Transcripts expressed during latency that encode antigenic peptides can make infected cells visible to the immune system, which would lead to continuous restimulation of the memory T-cell pool. Through simultaneous analysis of transcript expression and of the frequencies of epitope-specific CD8+ T cells during latency (IE1, m164, M105), a possible relationship between transcriptional activity and the expansion of the memory T-cell pool was investigated. Further characterization of subpopulations of the epitope-specific CD8+ T cells identified SLECs (short-lived effector cells; CD127low CD62Llow KLRG1high) as the dominant population in lung and spleen during mCMV latency. In a further part of the work, it was investigated whether IE gene expression is necessary for the establishment of latency. Using the recombinant mCMV-Δie2-DTR, which carries the gene sequence of the diphtheria toxin receptor (DTR) in place of the ie2 gene, infected DTR-expressing cells could be conditionally depleted by DT application. In the latently infectable cell type of the liver, the LSECs (liver sinusoidal endothelial cells), the viral load after mCMV-Δie2-DTR infection was reduced by 90 hours of DT application to the level of latently infected LSECs. These data support the hypothesis of a genome that is inactive from the outset and does not require IE gene expression for the establishment of latency. In addition, this approach represents a new animal model for the establishment of latency. Shorter waiting times until latency is fully established, compared with the previous bone marrow transplantation model, could considerably reduce animal housing costs and accelerate research progress.
Abstract:
Oceans are key sources and sinks in the global budgets of significant atmospheric trace gases, termed Volatile Organic Compounds (VOCs). Despite their low concentrations, these species have an important role in the atmosphere, influencing ozone photochemistry and aerosol physics. Surprisingly, little work has been done on assessing their emissions or their transport mechanisms and rates between ocean and atmosphere, all of which are important for modelling the atmosphere accurately. A new Needle Trap Device (NTD) GC-MS method was developed for the effective sampling and analysis of VOCs in seawater. Good repeatability (RSDs < 16%), linearity (R2 = 0.96 - 0.99) and limits of detection in the pM range were obtained for DMS, isoprene, benzene, toluene, p-xylene, (+)-α-pinene and (-)-α-pinene. Laboratory evaluation and subsequent field application indicated that the proposed method can be used successfully in place of the more usually applied extraction techniques (P&T, SPME), to extend the suite of species typically measured in the ocean and improve detection limits. During a mesocosm CO2 enrichment study, DMS, isoprene and α-pinene were identified and quantified in seawater samples using the above-mentioned method. Based on correlations with available biological datasets, the effects of ocean acidification as well as possible ocean biological sources were investigated for all examined compounds. The future ocean's acidity was shown to decrease oceanic DMS production, possibly impact isoprene emissions, but not affect the production of α-pinene. In a separate activity, ocean-atmosphere interactions were simulated in a large-scale wind-wave canal facility, in order to investigate the gas exchange process and its controlling mechanisms. Air-water exchange rates of 14 chemical species (of which 11 were VOCs) spanning a wide range of solubility (dimensionless solubility, α = 0.4 to 5470) and diffusivity (Schmidt number in water, Scw = 594 to 1194) were obtained under various turbulent (wind speed at ten meters height, u10 = 0.8 to 15 m s-1) and surfactant-modulated (two different-sized Triton X-100 layers) surface conditions. Reliable and reproducible total gas transfer velocities were obtained, and the derived values and trends were comparable to previous investigations. Through this study, a much better and more comprehensive understanding of the gas exchange process was accomplished. The role of friction velocity, uw*, and mean square slope, σs2, in defining phenomena such as waves and wave breaking, near-surface turbulence, bubbles and surface films was recognized as very significant. uw* was determined to be the ideal turbulence parameter, while σs2 best described the related surface conditions. A combination of the uw* and σs2 variables was found to reproduce the air-water gas exchange process faithfully. A Total Transfer Velocity (TTV) model based on a compilation of 14 tracers and a combination of the uw* and σs2 parameters is proposed for the first time. Through the proposed TTV parameterization, a new physical perspective is presented which provides an accurate TTV for any tracer within the examined solubility range.
The development of such a comprehensive air-sea gas exchange parameterization represents a highly useful tool for regional and global models, providing accurate total transfer velocity estimations for any tracer and any sea-surface state, simplifying the calculation process and eliminating the calculation uncertainty inevitably connected with the selection or combination of different parameterizations.
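For orientation only (the abstract does not reproduce the functional form of the TTV model), parameterizations of this kind are typically built on a two-layer resistance framework, with the water-side velocity scaled by near-surface turbulence and the Schmidt number. A generic sketch consistent with the quantities listed above (α, Scw, uw*, σs2), but not the specific formulation proposed in the thesis, is

\[
\frac{1}{K_{\mathrm{tot}}} = \frac{1}{\alpha\,k_a} + \frac{1}{k_w}, \qquad
k_w \propto f\!\left(u_{w*},\,\sigma_s^{2}\right)\, Sc_w^{-1/2}
\]

where K_tot is the total transfer velocity referenced to the water side, α the dimensionless solubility, k_a and k_w the air-side and water-side transfer velocities, and Sc_w the Schmidt number in water; highly soluble tracers become increasingly air-side controlled, which is what allows a single parameterization to span the full solubility range.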
Abstract:
Two experiments investigated the structure of memory for titles of 54 familiar tunes. The titles were presented in the form of a hierarchy, with nodes labeled by genre (e.g., Rock or Patriotic). Four groups of subjects received logical or randomized titles, and logical or randomized labels. Goodness of label and title structure had equal and additive beneficial effects on recall with a 3-min exposure of the stimuli. With a 4-min exposure, good title structure became a larger contributor to good recall. Clustering analyses suggested that subjects were mentally representing the tune titles hierarchically, even when presentation was random.
Abstract:
This paper provides an analysis of the key term aidagara (“betweenness”) in the philosophical ethics of Watsuji Tetsurō (1889-1960), in response to and in light of the recent movement in Japanese Buddhist studies known as “Critical Buddhism.” The Critical Buddhist call for a turn away from “topical” or intuitionist thinking and towards (properly Buddhist) “critical” thinking, while problematic in its bipolarity, raises the important issue of the place of “reason” versus “intuition” in Japanese Buddhist ethics. In this paper, a comparison of Watsuji’s “ontological quest” with that of Martin Heidegger (1889-1976), Watsuji’s primary Western source and foil, is followed by an evaluation of a corresponding search for an “ontology of social existence” undertaken by Tanabe Hajime (1885-1962). Ultimately, the philosophico-religious writings of Watsuji Tetsurō allow for the “return” of aesthesis as a modality of social being that is truly dimensionalized, and thus falls prey neither to the verticality of topicalism nor the limiting objectivity of criticalism.
Abstract:
Reform is a word that, one might easily say, characterizes more than any other the history and development of Buddhism. Yet it must also be said that reform movements in East Asian Buddhism have often taken on another goal—harmony or unification; that is, a desire not only to reconstruct a more worthy form of Buddhism, but simultaneously to bring together all existing forms under a single banner, in theory if not in practice. This paper explores some of the tensions between the desire for reform and the quest for harmony in modern Japanese Buddhist thought by comparing two developments: the late 19th-century movement towards ‘New Buddhism’ (shin Bukkyō), as exemplified by Murakami Senshō 村上専精 (1851–1929), and the late 20th-century movement known as ‘Critical Buddhism’ (hihan Bukkyō), as found in the works of Matsumoto Shirō 松本史朗 and Hakamaya Noriaki 袴谷憲昭. In all that has been written about Critical Buddhism, in both Japanese and English, very little attention has been paid to the place of the movement within the larger traditions of Japanese Buddhist reform. Here I reconsider Critical Buddhism in relation to the concerns of the previous, much larger trend towards Buddhist reform that emerged almost exactly 100 years earlier—the so-called shin Bukkyō or New Buddhism of the late-Meiji era. Shin Bukkyō is a catch-all term that includes the various writings and activities of Inoue Enryō, Shaku Sōen, and Kiyozawa Manshi, as well as the so-called Daijō-hibussetsuron, a broad term used (often critically) to describe Buddhist writers who suggested that Mahāyāna Buddhism is not, in fact, the Buddhism taught by the ‘historical’ Buddha Śākyamuni. Of these, I make a few general remarks about the Daijō-hibussetsuron before turning attention more specifically to the work of Murakami Senshō, in order to flesh out some of the similarities and differences between his attempt to construct a ‘unified Buddhism’ and the work of his late-20th-century avatars, the Critical Buddhists. Though a number of their aims and ideas overlap, I argue that there remain fundamental differences with respect to the ultimate purposes of Buddhist reform. This issue hinges on the implications of key terms such as ‘unity’ and ‘harmony’, as well as the way doctrinal history is categorized and understood, but it also relates to issues of ideology and the use and abuse of Buddhist doctrines in 20th-century politics.
Abstract:
I am truly honored to have been given the amazing opportunity to create this original piece, this powerful journey through memory and emotive exploration of the loss of childhood. How do we feel about the loss of our child-self? Could we ever get them back? How long, how deep would one have to dig in the graveyards, the playgrounds of memory, to uncover what was buried there... to un-erase what was erased? shading silhouettes of smaller ones will ultimately encourage a reconnection with the Inner Child hidden inside all of us, as well as an intimate awareness of the adult version of the self by looking back to the smaller ones. The main inspiration for this piece is then, of course, Inner Child Work. Most people may not be familiar with this therapeutic exploration of childhood... It was important to me, then, to present this concept in an imaginative, theatrical way, as a gift to you - a comprehensive and intensely moving gift. Speaking from experience, working on my Inner Child - my little Bianca - has been the most painful, frightening, yet rewarding and powerful experience within my personal life. Some people spend their entire lives trying to love themselves, to prove themselves, or to be accepted. Some are too afraid to look back to where it all began. The characters within this piece will face that fear... in a regression from the complexities of adulthood to the confusion of adolescence, all the way back to the wonder and bliss of childhood. They will reveal memories of both joy and pain, love and abandonment, journeying backwards through time - through memory - through a playground - back to the beginning... We will enter a world where a push of a merry-go-round spins us to games of Truth or Dare after a high school dance at 16 - or the slam of a metal fence reminds us of the door Dad slammed in our face at 9 - where the sound of chain links swings us back to scraping our knee by the sandbox at 5. This piece will attempt to connect everyone, both cast and audience, through a universal understanding and discussion of what it means to grow up, as well as a discovery of WHY we are the way we are - how experiences or relationships from our childhood have shaped our adult lives. We will attempt to challenge your honesty and nerve by inviting you to ask questions of yourselves, your past - to remember what it's like to have the innocence and hope of a child, to engage with and discover your Inner Child, to realize when or why you left them behind, and whether you want to bring this magical part of yourself back. It is my hope that you will join us in a collective journey - gather the courage to dig up the little kid you buried so long ago... The creation, design, choreography, and direction for shading silhouettes of smaller ones mark the culminating experience of a year-long independent study in Theatre.