9 results for Historic conscience. Country of Mossoró. Memory. Spatiality.
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The development of vaccines directed against the polysaccharide capsules of S. pneumoniae, H. influenzae and N. meningitidis has been of great importance in preventing potentially fatal infections. Bacterial capsular polysaccharides are T-cell-independent antigens that induce a specific antibody response characterized by IgM immunoglobulins, with a very low class-switched IgG response and no capacity to induce a booster response. The inability of pure polysaccharides to induce sustained immune responses has required the development of vaccines containing polysaccharides conjugated to a carrier protein, with the aim of generating T-cell help. It is clear that the immunogenicity of glycoconjugate vaccines can vary depending on several factors, e.g. the chemical nature of the linked polysaccharide, the carrier protein, the age of the target population and the adjuvant used. The present study analyzes the memory B cell (MBC) response to the polysaccharide and to the carrier protein following vaccination with a glycoconjugate vaccine for the prevention of Group B streptococcus (GBS) infection. Little is known about the role of adjuvants in the development of immunological memory raised against GBS polysaccharides, or about the influence of pre-existing immunity against the carrier protein on the B cell response raised against the polysaccharide component of the vaccine. We demonstrate in the mouse model that adjuvants can increase the antibody and memory B cell response to the carrier protein and to the conjugated polysaccharide. We also demonstrate that pre-existing immunity to the carrier protein favors the development of the antibody and memory B cell response to subsequent vaccinations with a glycoconjugate, even in the absence of adjuvants. These data provide useful insight for a better understanding of the mechanism of action of this class of vaccines and for designing vaccines that elicit a productive and long-lasting memory response.
Abstract:
Modern embedded systems embrace many-core shared-memory designs. Due to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmer's responsibility to explicitly manage memory transfers, and this makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before they can turn the parallel potential of the platform into actual performance. To support this, programming languages were proposed that work at a high level of abstraction and rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budgets are constrained. This dissertation explores the applicability of the shared-memory paradigm to modern many-core systems, focusing on ease of programming. It focuses on OpenMP, the de-facto standard for shared-memory programming. In the first part, the cost of algorithms for synchronization and data partitioning is analyzed, and the algorithms are adapted to modern embedded many-cores. Then, the original design of an OpenMP runtime library is presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. In the second part of the thesis, the focus is on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders-of-magnitude gains in speed and energy efficiency compared to pure software versions. However, three main issues arise, namely i) platform design complexity, ii) architectural scalability and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU) is proposed, which shares the memory banks with the cores, and a template for a scalable architecture is shown, which integrates the HWPUs through the shared-memory system. Then, a full software stack and toolchain are developed to support platform design and to let programmers exploit the accelerators of the platform. The OpenMP frontend is extended to interact with them.
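As an illustration of the kind of parallelism such a runtime must support, the following is a minimal sketch in standard OpenMP C of multi-level (nested) and irregular (task-based) parallelism. It is not the thesis's runtime or toolchain; the process() kernel, thread counts and problem size are placeholder assumptions.

```c
/* Minimal sketch (illustrative, not the thesis's runtime): multi-level and
 * irregular parallelism expressed in standard OpenMP C. */
#include <omp.h>
#include <stdio.h>

#define N 8

static void process(int cluster, int item)
{
    /* placeholder per-item kernel */
    printf("cluster %d handled item %d on thread %d\n",
           cluster, item, omp_get_thread_num());
}

int main(void)
{
    omp_set_nested(1);                      /* enable multi-level parallelism */

    #pragma omp parallel num_threads(2)     /* outer level: e.g. one team per cluster */
    {
        int cluster = omp_get_thread_num();

        #pragma omp parallel num_threads(4) /* inner level: workers within a cluster */
        {
            #pragma omp single              /* irregular parallelism via tasks */
            for (int i = 0; i < N; i++) {
                #pragma omp task firstprivate(i)
                process(cluster, i);
            }
        }
    }
    return 0;
}
```

The nested parallel regions and the task construct are exactly the constructs whose runtime cost is critical on resource-constrained embedded many-cores.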
Abstract:
The country-of-origin is the nationality of a food when it goes through customs in a foreign country, and it acts as a brand when the food is for sale in a foreign market. My research on country-of-origin labeling (COOL) started from a case study on extra virgin olive oil exported from Italy to China; the results show that asymmetric and imperfect origin information may lead to market inefficiency, and even market failure, in emerging countries. Then, I used the Delphi method to conduct qualitative and systematic research on COOL; the panel of experts in food labeling and food policy was composed of 19 members in 13 countries, and the most important consensus was that multiple-country-of-origin marking can provide accurate information about the origin of a food produced in two or more countries, avoiding consumer misinformation. Moreover, I extended the research on COOL by analyzing the rules of origin and drafting a guideline for the standardization of origin marking. Finally, from the perspective of information economics, I estimated the potential effect of multiple-country-of-origin labeling on the business models of international trade, and analyzed the regulatory options for mandatory or voluntary COOL of main ingredients. This research provides valuable insights for the formulation of COOL policy.
Abstract:
Phenomenology is a critical component of autobiographical memory retrieval. Some memories are vivid and rich in sensory details whereas others are faded; some memories are experienced as emotionally intense whereas others are not. Sutin and Robins (2007) identified 10 dimensions in which a memory may vary (Vividness, Coherence, Accessibility, Sensory Details, Emotional Intensity, Visual Perspective, Time Perspective, Sharing, Distancing, and Valence) and developed a comprehensive, psychometrically sound measure of memory phenomenology, the Memory Experiences Questionnaire (MEQ). Phenomenology has been linked to underlying stable dispositions (i.e., personality), as well as to a variety of positive and negative psychological outcomes (well-being and life satisfaction, depression and anxiety, among others). Using the MEQ, a cross-sectional and a longitudinal study were conducted on a large sample of American and Italian adults. In both studies, participants retrieved two key personal memories, a Turning Point and a Childhood Memory, and rated the affect and phenomenology of each memory. Participants also completed self-report measures of personality (i.e., Neuroticism and Conscientiousness) and measures of depression, well-being and life satisfaction. The present research showed that phenomenological ratings tend (a) to increase cross-sectionally across adulthood (Study 1), and (b) to be moderately stable over time, regardless of the contents of the memories (Study 2). Interrelations among memory phenomenology, personality and psychological outcome variables were also examined (Studies 1 and 2). In particular, autobiographical memory phenomenology was proposed as a dynamic expression of personality functioning that partially explains adaptive/maladaptive psychological outcomes. Indeed, the findings partially supported the hypothesized mediating effect of phenomenology on the association between personality and psychological outcomes. Implications of the findings are discussed and future lines of research are proposed; in particular, the need for more longitudinal studies is highlighted, along with the combined application of self-report questionnaires and narrative measures.
Abstract:
Nowadays, computing is migrating from traditional high-performance and distributed computing to pervasive and utility computing based on heterogeneous networks and clients. The current trend suggests that future IT services will rely on distributed resources and on fast communication of heterogeneous contents. The success of this new range of services is directly linked to the effectiveness of the infrastructure in delivering them. The communication infrastructure will be an aggregation of different technologies, even though the current trend suggests the emergence of a single IP-based transport service. Optical networking is a key technology for answering the increasing requests for dynamic bandwidth allocation and for configuring multiple topologies over the same physical-layer infrastructure; however, optical networks today are still far from letting users directly configure and offer network services, and they need to be enriched with more user-oriented functionalities. Moreover, current Control Plane architectures only facilitate efficient end-to-end connectivity provisioning and certainly cannot meet future network service requirements, e.g. the coordinated control of resources. The overall objective of this work is to improve the usability and accessibility of the services provided by the optical network. More precisely, the definition of a service-oriented architecture is the enabling technology that allows user applications to benefit from advanced services over an underlying dynamic optical layer. The definition of a service-oriented networking architecture based on advanced optical network technologies facilitates user and application access to abstracted levels of information regarding the offered advanced network services. This thesis addresses the problem of defining such a Service Oriented Architecture and its relevant building blocks, protocols and languages. In particular, this work focuses on the use of the SIP protocol as an inter-layer signalling protocol, which defines the Session Plane in conjunction with the Network Resource Description language. On the other hand, an advanced optical network must accommodate high data bandwidth with different granularities. Currently, two main technologies are emerging that promote the development of the future optical transport network: Optical Burst Switching and Optical Packet Switching. The two technologies promise to provide all-optical burst switching and packet switching, respectively, instead of the current circuit switching. However, the electronic domain is still present in the scheduling, forwarding and routing decisions. Because of the high optical transmission rates, the burst or packet scheduler faces a difficult challenge; consequently, a high-performance, timing-focused design of both the memory and the forwarding logic is needed. This open issue is addressed in this thesis by proposing a highly efficient implementation of a burst and packet scheduler. The main novelty of the proposed implementation is that the scheduling problem is turned into the simple calculation of a min/max function, whose complexity is almost independent of the traffic conditions.
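To illustrate how burst or packet scheduling can reduce to a simple min/max computation, the following is a minimal, hypothetical sketch in C of a horizon-based channel assignment: choosing a channel is a single max search over per-channel availability times. It is not the thesis's hardware design; the channel count, function names and timing values are placeholder assumptions.

```c
/* Minimal sketch (illustrative, not the thesis's scheduler): horizon-based
 * channel assignment for optical burst/packet switching, where scheduling
 * reduces to a max search over per-channel availability times. */
#include <stdio.h>

#define NUM_CHANNELS 4

/* horizon[c] = time at which channel c becomes free again */
static double horizon[NUM_CHANNELS];

/* Returns the chosen channel, or -1 if every channel is still busy at
 * arrival time t. Among the free channels, the one with the latest
 * horizon is picked, which minimizes the idle gap; the cost is a single
 * scan over the channels, independent of the traffic conditions. */
static int schedule_burst(double t, double duration)
{
    int best = -1;
    for (int c = 0; c < NUM_CHANNELS; c++) {
        if (horizon[c] <= t && (best < 0 || horizon[c] > horizon[best]))
            best = c;
    }
    if (best >= 0)
        horizon[best] = t + duration;   /* reserve the channel */
    return best;
}

int main(void)
{
    printf("burst 1 -> channel %d\n", schedule_burst(0.0, 5.0));
    printf("burst 2 -> channel %d\n", schedule_burst(1.0, 3.0));
    printf("burst 3 -> channel %d\n", schedule_burst(6.0, 2.0));
    return 0;
}
```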
Abstract:
The present study was performed to validate a spatial working memory task using pharmacological manipulations. The water escape T-maze, which combines the advantages of the Morris water maze and the T-maze while minimizing their disadvantages, was used. Scopolamine, a drug that affects cognitive function in spatial working memory tasks, significantly decreased rat performance in the present delayed alternation task. Since glutamate neurotransmission plays an important role in the maintenance of working memory, we evaluated the effect of ionotropic and metabotropic glutamatergic receptor antagonists, administered alone or in combination, on rat behaviour. As the acquisition and performance of memory tasks have been linked to the expression of the immediate early gene cFos, a marker of neuronal activation, we also investigated the neurochemical correlates of the water escape T-maze in various brain areas after pharmacological treatment with glutamatergic antagonists. Moreover, we focused our attention on the involvement of perirhinal cortex glutamatergic neurotransmission in the acquisition and/or consolidation of this particular task. The perirhinal cortex has strong and reciprocal connections with both specific cortical sensory areas and some memory-related structures, including the hippocampal formation and the amygdala. Because of this peculiar position, the perirhinal cortex has recently been regarded as a key region in working memory processes, in particular in providing temporary maintenance of information. The effect of perirhinal cortex lesions with ibotenic acid on the acquisition and consolidation of the water escape T-maze task was evaluated. In conclusion, our data suggest that the water escape T-maze can be considered a valid, simple and quite fast method to assess spatial working memory that is sensitive to pharmacological manipulations. Following execution of the task, we observed cFos expression in several brain regions. Furthermore, in accordance with the literature, our results suggest that glutamatergic neurotransmission plays an important role in the acquisition and consolidation of working memory processes.
Abstract:
Between the 1950s and the 1980s, a new form of labyrinth emerged in the work of European novelists such as Michel Butor, Alain Robbe-Grillet, Italo Calvino, Patrick Modiano and Alasdair Gray: a labyrinth that can be neither grasped nor mapped. To account for it, we draw on the model of the rhizome, taken from the philosophy of Gilles Deleuze and Félix Guattari, as well as on Michel Foucault's concept of heterotopia. The spatiality of these novels also leads us to consider the ironic rewritings of the myth of Theseus, Ariadne, the Minotaur and Daedalus. The quotations from and allusions to the myth highlight the distance from the traditional model and the effects of what may be regarded as a "mythical bricolage", within an ironic, parodic or satirical outlook. The novelistic representation of the labyrinth emphasizes, on the one hand, the absence of a centre and, on the other, the extreme openness of that space which is the contemporary city. At the same time, the presence of numerous "other spaces", Foucault's heterotopias, defines the disorientation of the novels' protagonists. As the writers become aware of the labyrinthine characteristics of these spaces, those characteristics begin to inform the novels themselves, thus creating a metafictional space. Between the 1950s and the early 1970s, the French New Novelists thereby stress the possibility of playing with the instruments of fiction in order to exacerbate the absence of meaning in the city as well as in the practice of writing. Calvino reformulates this conception of the novel, underlining the importance of a meaning, even if it is hidden and difficult to grasp. For this reason, at the end of the period under analysis, authors such as Modiano and Gray absorb the writing techniques of these predecessors while bringing them into play with the ethical responsibility of the author.
Abstract:
The subject of this research is the Wilhelm Lehmbruck Museum in Duisburg, a work by the architect Manfred Lehmbruck, designed and built between 1957 and 1964. This building, which houses the artistic production of the well-known sculptor Wilhelm Lehmbruck, Manfred's father, is among the first museums built from scratch in the Federal Republic of Germany after the Second World War. The myth of Wilhelm Lehmbruck, constructed over the years to give a cultural identity to the industrial city of Duisburg, was reinvigorated after the war within a more general tendency, arising in the Bonn Republic, towards the re-evaluation of modern art, which National Socialism had declared degenerate. Reconnecting with the modern art and architecture of the 1920s was at that moment functional to redesigning a new, democratic face for the young German state, which sought legitimacy by proclaiming itself heir to the mythical and glorious Weimar Republic. After years of debate about reconstruction, the architecture of the neues Bauen seemed the only way in which the Federal Republic could present itself to the world, even though the reality of the country was far more complex and revealed the double face that characterized this state from 1945 onwards. The numerous dichotomies that soon populated the tabula rasa born from the ashes of the conflict (memory/oblivion, tradition/modernity, continuity/discontinuity with the recent and ill-fated past) find expression in the history and in the particular architecture of the Duisburg museum, which can therefore be interpreted as a paradigmatic work for understanding the new identity of the Federal Republic, an identity that made it capable of rising again after the "year zero", seeking in the economic miracle an instrument of redemption from a shameful past that had to be silenced, forgotten and left behind.
Abstract:
Shape memory materials (SMMs) represent an important class of smart materials that have the ability to return from a deformed state to their original shape. Thanks to such a property, SMMs are utilized in a wide range of innovative applications. The increasing number of applications and the consequent involvement of industrial players in the field have motivated researchers to formulate constitutive models able to capture the complex behavior of these materials and to develop robust computational tools for design purposes. This research field is still evolving, especially in the prediction of shape memory polymer (SMP) behavior and of important effects characterizing shape memory alloy (SMA) applications. Moreover, the frequent use of shape memory and metallic materials in biomedical devices, particularly in cardiovascular stents, which are implanted in the human body and experience millions of in-vivo loading cycles induced by blood pressure, clearly indicates the need for a deeper understanding of fatigue/fracture failure in micro-sized components. The development of reliable stent designs against fatigue is still an open subject in the scientific literature. Motivated by the described framework, the thesis focuses on several research issues involving the advanced constitutive, numerical and fatigue modeling of elastoplastic and shape memory materials. Starting from the constitutive modeling, the thesis proposes refined phenomenological models for reliable descriptions of SMA and SMP behavior. Then, concerning the numerical modeling, the thesis proposes to implement the models in numerical software by developing implicit/explicit time-integration algorithms, to guarantee robust computational tools for practical purposes. The described modeling activities are completed by experimental investigations on SMA actuator springs and polyethylene polymers. Finally, regarding the fatigue modeling, the thesis introduces a general computational approach for the fatigue-life assessment of a classical stent design, in order to exploit computer-based simulations to prevent failures and modify the design without testing numerous devices.
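As a minimal illustration of the implicit time-integration algorithms mentioned above, the following sketch in C shows the classical backward-Euler return mapping for 1D elastoplasticity with linear isotropic hardening. It is not the thesis's SMA/SMP model; the material parameters and loading path are placeholder assumptions.

```c
/* Minimal sketch (illustrative, not the thesis's model): backward-Euler
 * return mapping for 1D elastoplasticity with linear isotropic hardening. */
#include <math.h>
#include <stdio.h>

typedef struct {
    double E;        /* Young's modulus             */
    double H;        /* isotropic hardening modulus */
    double sigma_y;  /* initial yield stress        */
} Material;

typedef struct {
    double eps_p;    /* plastic strain              */
    double alpha;    /* accumulated plastic strain  */
} State;

/* Given the total strain eps at the end of the step, update the internal
 * variables implicitly and return the stress. */
static double return_mapping(const Material *m, State *s, double eps)
{
    double sigma_trial = m->E * (eps - s->eps_p);
    double f_trial = fabs(sigma_trial) - (m->sigma_y + m->H * s->alpha);

    if (f_trial <= 0.0)
        return sigma_trial;                      /* elastic step */

    double dgamma = f_trial / (m->E + m->H);     /* plastic multiplier */
    double sign = (sigma_trial >= 0.0) ? 1.0 : -1.0;

    s->eps_p += dgamma * sign;
    s->alpha += dgamma;
    return sigma_trial - m->E * dgamma * sign;   /* corrected stress */
}

int main(void)
{
    Material m = { 200e3, 10e3, 250.0 };         /* placeholder values, MPa */
    State s = { 0.0, 0.0 };

    for (int i = 0; i <= 10; i++) {              /* monotonic strain-driven loading */
        double eps = 0.001 * i;
        printf("eps = %.3f  sigma = %.1f MPa\n", eps, return_mapping(&m, &s, eps));
    }
    return 0;
}
```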